Oct 06 12:59:00 localhost kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct 06 12:59:00 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 06 12:59:00 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 06 12:59:00 localhost kernel: BIOS-provided physical RAM map:
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 06 12:59:00 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 06 12:59:00 localhost kernel: NX (Execute Disable) protection: active
Oct 06 12:59:00 localhost kernel: APIC: Static calls initialized
Oct 06 12:59:00 localhost kernel: SMBIOS 2.8 present.
Oct 06 12:59:00 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 06 12:59:00 localhost kernel: Hypervisor detected: KVM
Oct 06 12:59:00 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 06 12:59:00 localhost kernel: kvm-clock: using sched offset of 4275984222 cycles
Oct 06 12:59:00 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 06 12:59:00 localhost kernel: tsc: Detected 2800.000 MHz processor
Oct 06 12:59:00 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 06 12:59:00 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 06 12:59:00 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 06 12:59:00 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 06 12:59:00 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 06 12:59:00 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 06 12:59:00 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 06 12:59:00 localhost kernel: Using GB pages for direct mapping
Oct 06 12:59:00 localhost kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct 06 12:59:00 localhost kernel: ACPI: Early table checksum verification disabled
Oct 06 12:59:00 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 06 12:59:00 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 06 12:59:00 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 06 12:59:00 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 06 12:59:00 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 06 12:59:00 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 06 12:59:00 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 06 12:59:00 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 06 12:59:00 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 06 12:59:00 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 06 12:59:00 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 06 12:59:00 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 06 12:59:00 localhost kernel: No NUMA configuration found
Oct 06 12:59:00 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 06 12:59:00 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Oct 06 12:59:00 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 06 12:59:00 localhost kernel: Zone ranges:
Oct 06 12:59:00 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 06 12:59:00 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 06 12:59:00 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 06 12:59:00 localhost kernel:   Device   empty
Oct 06 12:59:00 localhost kernel: Movable zone start for each node
Oct 06 12:59:00 localhost kernel: Early memory node ranges
Oct 06 12:59:00 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 06 12:59:00 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 06 12:59:00 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 06 12:59:00 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 06 12:59:00 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 06 12:59:00 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 06 12:59:00 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 06 12:59:00 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 06 12:59:00 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 06 12:59:00 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 06 12:59:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 06 12:59:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 06 12:59:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 06 12:59:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 06 12:59:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 06 12:59:00 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 06 12:59:00 localhost kernel: TSC deadline timer available
Oct 06 12:59:00 localhost kernel: CPU topo: Max. logical packages:   8
Oct 06 12:59:00 localhost kernel: CPU topo: Max. logical dies:       8
Oct 06 12:59:00 localhost kernel: CPU topo: Max. dies per package:   1
Oct 06 12:59:00 localhost kernel: CPU topo: Max. threads per core:   1
Oct 06 12:59:00 localhost kernel: CPU topo: Num. cores per package:     1
Oct 06 12:59:00 localhost kernel: CPU topo: Num. threads per package:   1
Oct 06 12:59:00 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 06 12:59:00 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 06 12:59:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 06 12:59:00 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 06 12:59:00 localhost kernel: Booting paravirtualized kernel on KVM
Oct 06 12:59:00 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 06 12:59:00 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 06 12:59:00 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 06 12:59:00 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 06 12:59:00 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 06 12:59:00 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 06 12:59:00 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 06 12:59:00 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct 06 12:59:00 localhost kernel: random: crng init done
Oct 06 12:59:00 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 06 12:59:00 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 06 12:59:00 localhost kernel: Fallback order for Node 0: 0 
Oct 06 12:59:00 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 06 12:59:00 localhost kernel: Policy zone: Normal
Oct 06 12:59:00 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 06 12:59:00 localhost kernel: software IO TLB: area num 8.
Oct 06 12:59:00 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 06 12:59:00 localhost kernel: ftrace: allocating 49370 entries in 193 pages
Oct 06 12:59:00 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 06 12:59:00 localhost kernel: Dynamic Preempt: voluntary
Oct 06 12:59:00 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 06 12:59:00 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 06 12:59:00 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 06 12:59:00 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 06 12:59:00 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 06 12:59:00 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 06 12:59:00 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 06 12:59:00 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 06 12:59:00 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 06 12:59:00 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 06 12:59:00 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 06 12:59:00 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 06 12:59:00 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 06 12:59:00 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 06 12:59:00 localhost kernel: Console: colour VGA+ 80x25
Oct 06 12:59:00 localhost kernel: printk: console [ttyS0] enabled
Oct 06 12:59:00 localhost kernel: ACPI: Core revision 20230331
Oct 06 12:59:00 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 06 12:59:00 localhost kernel: x2apic enabled
Oct 06 12:59:00 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 06 12:59:00 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 06 12:59:00 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 06 12:59:00 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 06 12:59:00 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 06 12:59:00 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 06 12:59:00 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 06 12:59:00 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 06 12:59:00 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 06 12:59:00 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 06 12:59:00 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 06 12:59:00 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 06 12:59:00 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 06 12:59:00 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 06 12:59:00 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 06 12:59:00 localhost kernel: x86/bugs: return thunk changed
Oct 06 12:59:00 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 06 12:59:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 06 12:59:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 06 12:59:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 06 12:59:00 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 06 12:59:00 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 06 12:59:00 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 06 12:59:00 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 06 12:59:00 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 06 12:59:00 localhost kernel: landlock: Up and running.
Oct 06 12:59:00 localhost kernel: Yama: becoming mindful.
Oct 06 12:59:00 localhost kernel: SELinux:  Initializing.
Oct 06 12:59:00 localhost kernel: LSM support for eBPF active
Oct 06 12:59:00 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 06 12:59:00 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 06 12:59:00 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 06 12:59:00 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 06 12:59:00 localhost kernel: ... version:                0
Oct 06 12:59:00 localhost kernel: ... bit width:              48
Oct 06 12:59:00 localhost kernel: ... generic registers:      6
Oct 06 12:59:00 localhost kernel: ... value mask:             0000ffffffffffff
Oct 06 12:59:00 localhost kernel: ... max period:             00007fffffffffff
Oct 06 12:59:00 localhost kernel: ... fixed-purpose events:   0
Oct 06 12:59:00 localhost kernel: ... event mask:             000000000000003f
Oct 06 12:59:00 localhost kernel: signal: max sigframe size: 1776
Oct 06 12:59:00 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 06 12:59:00 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 06 12:59:00 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 06 12:59:00 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 06 12:59:00 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 06 12:59:00 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 06 12:59:00 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 06 12:59:00 localhost kernel: node 0 deferred pages initialised in 27ms
Oct 06 12:59:00 localhost kernel: Memory: 7765632K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616512K reserved, 0K cma-reserved)
Oct 06 12:59:00 localhost kernel: devtmpfs: initialized
Oct 06 12:59:00 localhost kernel: x86/mm: Memory block size: 128MB
Oct 06 12:59:00 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 06 12:59:00 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 06 12:59:00 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 06 12:59:00 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 06 12:59:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 06 12:59:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 06 12:59:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 06 12:59:00 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 06 12:59:00 localhost kernel: audit: type=2000 audit(1759755538.686:1): state=initialized audit_enabled=0 res=1
Oct 06 12:59:00 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 06 12:59:00 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 06 12:59:00 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 06 12:59:00 localhost kernel: cpuidle: using governor menu
Oct 06 12:59:00 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 06 12:59:00 localhost kernel: PCI: Using configuration type 1 for base access
Oct 06 12:59:00 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 06 12:59:00 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 06 12:59:00 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 06 12:59:00 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 06 12:59:00 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 06 12:59:00 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 06 12:59:00 localhost kernel: Demotion targets for Node 0: null
Oct 06 12:59:00 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 06 12:59:00 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 06 12:59:00 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 06 12:59:00 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 06 12:59:00 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 06 12:59:00 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 06 12:59:00 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 06 12:59:00 localhost kernel: ACPI: Interpreter enabled
Oct 06 12:59:00 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 06 12:59:00 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 06 12:59:00 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 06 12:59:00 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 06 12:59:00 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 06 12:59:00 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 06 12:59:00 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [3] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [4] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [5] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [6] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [7] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [8] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [9] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [10] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [11] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [12] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [13] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [14] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [15] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [16] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [17] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [18] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [19] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [20] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [21] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [22] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [23] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [24] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [25] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [26] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [27] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [28] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [29] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [30] registered
Oct 06 12:59:00 localhost kernel: acpiphp: Slot [31] registered
Oct 06 12:59:00 localhost kernel: PCI host bridge to bus 0000:00
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 06 12:59:00 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 06 12:59:00 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 06 12:59:00 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 06 12:59:00 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 06 12:59:00 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 06 12:59:00 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 06 12:59:00 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 06 12:59:00 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 06 12:59:00 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 06 12:59:00 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 06 12:59:00 localhost kernel: iommu: Default domain type: Translated
Oct 06 12:59:00 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 06 12:59:00 localhost kernel: SCSI subsystem initialized
Oct 06 12:59:00 localhost kernel: ACPI: bus type USB registered
Oct 06 12:59:00 localhost kernel: usbcore: registered new interface driver usbfs
Oct 06 12:59:00 localhost kernel: usbcore: registered new interface driver hub
Oct 06 12:59:00 localhost kernel: usbcore: registered new device driver usb
Oct 06 12:59:00 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 06 12:59:00 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 06 12:59:00 localhost kernel: PTP clock support registered
Oct 06 12:59:00 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 06 12:59:00 localhost kernel: NetLabel: Initializing
Oct 06 12:59:00 localhost kernel: NetLabel:  domain hash size = 128
Oct 06 12:59:00 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 06 12:59:00 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 06 12:59:00 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 06 12:59:00 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 06 12:59:00 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 06 12:59:00 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 06 12:59:00 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 06 12:59:00 localhost kernel: vgaarb: loaded
Oct 06 12:59:00 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 06 12:59:00 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 06 12:59:00 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 06 12:59:00 localhost kernel: pnp: PnP ACPI init
Oct 06 12:59:00 localhost kernel: pnp 00:03: [dma 2]
Oct 06 12:59:00 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 06 12:59:00 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 06 12:59:00 localhost kernel: NET: Registered PF_INET protocol family
Oct 06 12:59:00 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 06 12:59:00 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 06 12:59:00 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 06 12:59:00 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 06 12:59:00 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 06 12:59:00 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 06 12:59:00 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 06 12:59:00 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 06 12:59:00 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 06 12:59:00 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 06 12:59:00 localhost kernel: NET: Registered PF_XDP protocol family
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 06 12:59:00 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 06 12:59:00 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 06 12:59:00 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 06 12:59:00 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 101641 usecs
Oct 06 12:59:00 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 06 12:59:00 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 06 12:59:00 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 06 12:59:00 localhost kernel: ACPI: bus type thunderbolt registered
Oct 06 12:59:00 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 06 12:59:00 localhost kernel: Initialise system trusted keyrings
Oct 06 12:59:00 localhost kernel: Key type blacklist registered
Oct 06 12:59:00 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 06 12:59:00 localhost kernel: zbud: loaded
Oct 06 12:59:00 localhost kernel: integrity: Platform Keyring initialized
Oct 06 12:59:00 localhost kernel: integrity: Machine keyring initialized
Oct 06 12:59:00 localhost kernel: Freeing initrd memory: 86104K
Oct 06 12:59:00 localhost kernel: NET: Registered PF_ALG protocol family
Oct 06 12:59:00 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 06 12:59:00 localhost kernel: Key type asymmetric registered
Oct 06 12:59:00 localhost kernel: Asymmetric key parser 'x509' registered
Oct 06 12:59:00 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 06 12:59:00 localhost kernel: io scheduler mq-deadline registered
Oct 06 12:59:00 localhost kernel: io scheduler kyber registered
Oct 06 12:59:00 localhost kernel: io scheduler bfq registered
Oct 06 12:59:00 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 06 12:59:00 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 06 12:59:00 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 06 12:59:00 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 06 12:59:00 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 06 12:59:00 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 06 12:59:00 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 06 12:59:00 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 06 12:59:00 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 06 12:59:00 localhost kernel: Non-volatile memory driver v1.3
Oct 06 12:59:00 localhost kernel: rdac: device handler registered
Oct 06 12:59:00 localhost kernel: hp_sw: device handler registered
Oct 06 12:59:00 localhost kernel: emc: device handler registered
Oct 06 12:59:00 localhost kernel: alua: device handler registered
Oct 06 12:59:00 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 06 12:59:00 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 06 12:59:00 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 06 12:59:00 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 06 12:59:00 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 06 12:59:00 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 06 12:59:00 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 06 12:59:00 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct 06 12:59:00 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 06 12:59:00 localhost kernel: hub 1-0:1.0: USB hub found
Oct 06 12:59:00 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 06 12:59:00 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 06 12:59:00 localhost kernel: usbserial: USB Serial support registered for generic
Oct 06 12:59:00 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 06 12:59:00 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 06 12:59:00 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 06 12:59:00 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 06 12:59:00 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 06 12:59:00 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 06 12:59:00 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 06 12:59:00 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-06T12:58:59 UTC (1759755539)
Oct 06 12:59:00 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 06 12:59:00 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 06 12:59:00 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 06 12:59:00 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 06 12:59:00 localhost kernel: usbcore: registered new interface driver usbhid
Oct 06 12:59:00 localhost kernel: usbhid: USB HID core driver
Oct 06 12:59:00 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 06 12:59:00 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 06 12:59:00 localhost kernel: Initializing XFRM netlink socket
Oct 06 12:59:00 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 06 12:59:00 localhost kernel: Segment Routing with IPv6
Oct 06 12:59:00 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 06 12:59:00 localhost kernel: mpls_gso: MPLS GSO support
Oct 06 12:59:00 localhost kernel: IPI shorthand broadcast: enabled
Oct 06 12:59:00 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 06 12:59:00 localhost kernel: AES CTR mode by8 optimization enabled
Oct 06 12:59:00 localhost kernel: sched_clock: Marking stable (1272008260, 148526460)->(1550644130, -130109410)
Oct 06 12:59:00 localhost kernel: registered taskstats version 1
Oct 06 12:59:00 localhost kernel: Loading compiled-in X.509 certificates
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 06 12:59:00 localhost kernel: Demotion targets for Node 0: null
Oct 06 12:59:00 localhost kernel: page_owner is disabled
Oct 06 12:59:00 localhost kernel: Key type .fscrypt registered
Oct 06 12:59:00 localhost kernel: Key type fscrypt-provisioning registered
Oct 06 12:59:00 localhost kernel: Key type big_key registered
Oct 06 12:59:00 localhost kernel: Key type encrypted registered
Oct 06 12:59:00 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 06 12:59:00 localhost kernel: Loading compiled-in module X.509 certificates
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 06 12:59:00 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 06 12:59:00 localhost kernel: ima: No architecture policies found
Oct 06 12:59:00 localhost kernel: evm: Initialising EVM extended attributes:
Oct 06 12:59:00 localhost kernel: evm: security.selinux
Oct 06 12:59:00 localhost kernel: evm: security.SMACK64 (disabled)
Oct 06 12:59:00 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 06 12:59:00 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 06 12:59:00 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 06 12:59:00 localhost kernel: evm: security.apparmor (disabled)
Oct 06 12:59:00 localhost kernel: evm: security.ima
Oct 06 12:59:00 localhost kernel: evm: security.capability
Oct 06 12:59:00 localhost kernel: evm: HMAC attrs: 0x1
Oct 06 12:59:00 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 06 12:59:00 localhost kernel: Running certificate verification RSA selftest
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 06 12:59:00 localhost kernel: Running certificate verification ECDSA selftest
Oct 06 12:59:00 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 06 12:59:00 localhost kernel: clk: Disabling unused clocks
Oct 06 12:59:00 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 06 12:59:00 localhost kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct 06 12:59:00 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 06 12:59:00 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct 06 12:59:00 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 06 12:59:00 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 06 12:59:00 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 06 12:59:00 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 06 12:59:00 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 06 12:59:00 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 06 12:59:00 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 06 12:59:00 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 06 12:59:00 localhost kernel: Run /init as init process
Oct 06 12:59:00 localhost kernel:   with arguments:
Oct 06 12:59:00 localhost kernel:     /init
Oct 06 12:59:00 localhost kernel:   with environment:
Oct 06 12:59:00 localhost kernel:     HOME=/
Oct 06 12:59:00 localhost kernel:     TERM=linux
Oct 06 12:59:00 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64
Oct 06 12:59:00 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 06 12:59:00 localhost systemd[1]: Detected virtualization kvm.
Oct 06 12:59:00 localhost systemd[1]: Detected architecture x86-64.
Oct 06 12:59:00 localhost systemd[1]: Running in initrd.
Oct 06 12:59:00 localhost systemd[1]: No hostname configured, using default hostname.
Oct 06 12:59:00 localhost systemd[1]: Hostname set to <localhost>.
Oct 06 12:59:00 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 06 12:59:00 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 06 12:59:00 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 06 12:59:00 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 06 12:59:00 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 06 12:59:00 localhost systemd[1]: Reached target Local File Systems.
Oct 06 12:59:00 localhost systemd[1]: Reached target Path Units.
Oct 06 12:59:00 localhost systemd[1]: Reached target Slice Units.
Oct 06 12:59:00 localhost systemd[1]: Reached target Swaps.
Oct 06 12:59:00 localhost systemd[1]: Reached target Timer Units.
Oct 06 12:59:00 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 06 12:59:00 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 06 12:59:00 localhost systemd[1]: Listening on Journal Socket.
Oct 06 12:59:00 localhost systemd[1]: Listening on udev Control Socket.
Oct 06 12:59:00 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 06 12:59:00 localhost systemd[1]: Reached target Socket Units.
Oct 06 12:59:00 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 06 12:59:00 localhost systemd[1]: Starting Journal Service...
Oct 06 12:59:00 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 06 12:59:00 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 06 12:59:00 localhost systemd[1]: Starting Create System Users...
Oct 06 12:59:00 localhost systemd[1]: Starting Setup Virtual Console...
Oct 06 12:59:00 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 06 12:59:00 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 06 12:59:00 localhost systemd[1]: Finished Create System Users.
Oct 06 12:59:00 localhost systemd-journald[309]: Journal started
Oct 06 12:59:00 localhost systemd-journald[309]: Runtime Journal (/run/log/journal/3bc25ed832494f45b2833dd869d73ce5) is 8.0M, max 153.5M, 145.5M free.
Oct 06 12:59:00 localhost systemd-sysusers[314]: Creating group 'users' with GID 100.
Oct 06 12:59:00 localhost systemd-sysusers[314]: Creating group 'dbus' with GID 81.
Oct 06 12:59:00 localhost systemd-sysusers[314]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 06 12:59:00 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 06 12:59:00 localhost systemd[1]: Started Journal Service.
Oct 06 12:59:00 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 06 12:59:00 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 06 12:59:00 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 06 12:59:00 localhost systemd[1]: Finished Setup Virtual Console.
Oct 06 12:59:00 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 06 12:59:00 localhost systemd[1]: Starting dracut cmdline hook...
Oct 06 12:59:00 localhost dracut-cmdline[329]: dracut-9 dracut-057-102.git20250818.el9
Oct 06 12:59:00 localhost dracut-cmdline[329]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 06 12:59:00 localhost systemd[1]: Finished dracut cmdline hook.
Oct 06 12:59:00 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 06 12:59:00 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 06 12:59:00 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 06 12:59:00 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 06 12:59:00 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 06 12:59:00 localhost kernel: RPC: Registered udp transport module.
Oct 06 12:59:00 localhost kernel: RPC: Registered tcp transport module.
Oct 06 12:59:00 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 06 12:59:00 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 06 12:59:00 localhost rpc.statd[446]: Version 2.5.4 starting
Oct 06 12:59:00 localhost rpc.statd[446]: Initializing NSM state
Oct 06 12:59:00 localhost rpc.idmapd[451]: Setting log level to 0
Oct 06 12:59:00 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 06 12:59:00 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 06 12:59:00 localhost systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Oct 06 12:59:00 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 06 12:59:00 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 06 12:59:01 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 06 12:59:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 06 12:59:01 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 06 12:59:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 06 12:59:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 06 12:59:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 06 12:59:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 06 12:59:01 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 06 12:59:01 localhost systemd[1]: Reached target Network.
Oct 06 12:59:01 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 06 12:59:01 localhost systemd[1]: Starting dracut initqueue hook...
Oct 06 12:59:01 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 06 12:59:01 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 06 12:59:01 localhost kernel:  vda: vda1
Oct 06 12:59:01 localhost systemd-udevd[466]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 12:59:01 localhost kernel: libata version 3.00 loaded.
Oct 06 12:59:01 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 06 12:59:01 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 06 12:59:01 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 06 12:59:01 localhost kernel: scsi host0: ata_piix
Oct 06 12:59:01 localhost kernel: scsi host1: ata_piix
Oct 06 12:59:01 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 06 12:59:01 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 06 12:59:01 localhost systemd[1]: Reached target System Initialization.
Oct 06 12:59:01 localhost systemd[1]: Reached target Basic System.
Oct 06 12:59:01 localhost systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 06 12:59:01 localhost systemd[1]: Reached target Initrd Root Device.
Oct 06 12:59:01 localhost kernel: ata1: found unknown device (class 0)
Oct 06 12:59:01 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 06 12:59:01 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 06 12:59:01 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 06 12:59:01 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 06 12:59:01 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 06 12:59:01 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 06 12:59:01 localhost systemd[1]: Finished dracut initqueue hook.
Oct 06 12:59:01 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 06 12:59:01 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 06 12:59:01 localhost systemd[1]: Reached target Remote File Systems.
Oct 06 12:59:01 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 06 12:59:01 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 06 12:59:01 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct 06 12:59:01 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Oct 06 12:59:01 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 06 12:59:01 localhost systemd[1]: Mounting /sysroot...
Oct 06 12:59:02 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 06 12:59:02 localhost kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct 06 12:59:02 localhost kernel: XFS (vda1): Ending clean mount
Oct 06 12:59:02 localhost systemd[1]: Mounted /sysroot.
Oct 06 12:59:02 localhost systemd[1]: Reached target Initrd Root File System.
Oct 06 12:59:02 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 06 12:59:02 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 06 12:59:02 localhost systemd[1]: Reached target Initrd File Systems.
Oct 06 12:59:02 localhost systemd[1]: Reached target Initrd Default Target.
Oct 06 12:59:02 localhost systemd[1]: Starting dracut mount hook...
Oct 06 12:59:02 localhost systemd[1]: Finished dracut mount hook.
Oct 06 12:59:02 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 06 12:59:02 localhost rpc.idmapd[451]: exiting on signal 15
Oct 06 12:59:02 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 06 12:59:02 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 06 12:59:02 localhost systemd[1]: Stopped target Network.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Timer Units.
Oct 06 12:59:02 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 06 12:59:02 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Basic System.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Path Units.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Remote File Systems.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Slice Units.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Socket Units.
Oct 06 12:59:02 localhost systemd[1]: Stopped target System Initialization.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Local File Systems.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Swaps.
Oct 06 12:59:02 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut mount hook.
Oct 06 12:59:02 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 06 12:59:02 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 06 12:59:02 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 06 12:59:02 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 06 12:59:02 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 06 12:59:02 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 06 12:59:02 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 06 12:59:02 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 06 12:59:02 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 06 12:59:02 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 06 12:59:02 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 06 12:59:02 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 06 12:59:02 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Closed udev Control Socket.
Oct 06 12:59:02 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Closed udev Kernel Socket.
Oct 06 12:59:02 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 06 12:59:02 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 06 12:59:02 localhost systemd[1]: Starting Cleanup udev Database...
Oct 06 12:59:02 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 06 12:59:02 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 06 12:59:02 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Stopped Create System Users.
Oct 06 12:59:02 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 06 12:59:02 localhost systemd[1]: Finished Cleanup udev Database.
Oct 06 12:59:02 localhost systemd[1]: Reached target Switch Root.
Oct 06 12:59:02 localhost systemd[1]: Starting Switch Root...
Oct 06 12:59:02 localhost systemd[1]: Switching root.
Oct 06 12:59:02 localhost systemd-journald[309]: Journal stopped
Oct 06 12:59:03 localhost systemd-journald[309]: Received SIGTERM from PID 1 (systemd).
Oct 06 12:59:03 localhost kernel: audit: type=1404 audit(1759755542.955:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability open_perms=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 12:59:03 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 12:59:03 localhost kernel: audit: type=1403 audit(1759755543.106:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 06 12:59:03 localhost systemd[1]: Successfully loaded SELinux policy in 154.864ms.
Oct 06 12:59:03 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.456ms.
Oct 06 12:59:03 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 06 12:59:03 localhost systemd[1]: Detected virtualization kvm.
Oct 06 12:59:03 localhost systemd[1]: Detected architecture x86-64.
Oct 06 12:59:03 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 12:59:03 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Stopped Switch Root.
Oct 06 12:59:03 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 06 12:59:03 localhost systemd[1]: Created slice Slice /system/getty.
Oct 06 12:59:03 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 06 12:59:03 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 06 12:59:03 localhost systemd[1]: Created slice User and Session Slice.
Oct 06 12:59:03 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 06 12:59:03 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 06 12:59:03 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 06 12:59:03 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 06 12:59:03 localhost systemd[1]: Stopped target Switch Root.
Oct 06 12:59:03 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 06 12:59:03 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 06 12:59:03 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 06 12:59:03 localhost systemd[1]: Reached target Path Units.
Oct 06 12:59:03 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 06 12:59:03 localhost systemd[1]: Reached target Slice Units.
Oct 06 12:59:03 localhost systemd[1]: Reached target Swaps.
Oct 06 12:59:03 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 06 12:59:03 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 06 12:59:03 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 06 12:59:03 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 06 12:59:03 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 06 12:59:03 localhost systemd[1]: Listening on udev Control Socket.
Oct 06 12:59:03 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 06 12:59:03 localhost systemd[1]: Mounting Huge Pages File System...
Oct 06 12:59:03 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 06 12:59:03 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 06 12:59:03 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 06 12:59:03 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 06 12:59:03 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 06 12:59:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 06 12:59:03 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 06 12:59:03 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 06 12:59:03 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 06 12:59:03 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 06 12:59:03 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 06 12:59:03 localhost systemd[1]: Stopped Journal Service.
Oct 06 12:59:03 localhost systemd[1]: Starting Journal Service...
Oct 06 12:59:03 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 06 12:59:03 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 06 12:59:03 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 06 12:59:03 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 06 12:59:03 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 06 12:59:03 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 06 12:59:03 localhost kernel: fuse: init (API version 7.37)
Oct 06 12:59:03 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 06 12:59:03 localhost systemd[1]: Mounted Huge Pages File System.
Oct 06 12:59:03 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 06 12:59:03 localhost systemd-journald[678]: Journal started
Oct 06 12:59:03 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 06 12:59:03 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 06 12:59:03 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Started Journal Service.
Oct 06 12:59:03 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 06 12:59:03 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 06 12:59:03 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 06 12:59:03 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 06 12:59:03 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 06 12:59:03 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 06 12:59:03 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 06 12:59:03 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 06 12:59:03 localhost kernel: ACPI: bus type drm_connector registered
Oct 06 12:59:03 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 06 12:59:03 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 06 12:59:03 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 06 12:59:03 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 06 12:59:03 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 06 12:59:03 localhost systemd[1]: Mounting FUSE Control File System...
Oct 06 12:59:03 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 06 12:59:03 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 06 12:59:03 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 06 12:59:03 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 06 12:59:03 localhost systemd[1]: Starting Load/Save OS Random Seed...
Oct 06 12:59:03 localhost systemd[1]: Starting Create System Users...
Oct 06 12:59:03 localhost systemd[1]: Mounted FUSE Control File System.
Oct 06 12:59:03 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 06 12:59:03 localhost systemd-journald[678]: Received client request to flush runtime journal.
Oct 06 12:59:03 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 06 12:59:03 localhost systemd[1]: Finished Load/Save OS Random Seed.
Oct 06 12:59:03 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 06 12:59:03 localhost systemd[1]: Finished Create System Users.
Oct 06 12:59:03 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 06 12:59:03 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 06 12:59:03 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 06 12:59:03 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 06 12:59:03 localhost systemd[1]: Reached target Local File Systems.
Oct 06 12:59:03 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 06 12:59:03 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 06 12:59:03 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 06 12:59:03 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 06 12:59:03 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 06 12:59:03 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 06 12:59:03 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 06 12:59:03 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Oct 06 12:59:03 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 06 12:59:03 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 06 12:59:03 localhost systemd[1]: Starting Security Auditing Service...
Oct 06 12:59:04 localhost systemd[1]: Starting RPC Bind...
Oct 06 12:59:04 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 06 12:59:04 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 06 12:59:04 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 06 12:59:04 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 06 12:59:04 localhost augenrules[707]: /sbin/augenrules: No change
Oct 06 12:59:04 localhost systemd[1]: Started RPC Bind.
Oct 06 12:59:04 localhost augenrules[722]: No rules
Oct 06 12:59:04 localhost augenrules[722]: enabled 1
Oct 06 12:59:04 localhost augenrules[722]: failure 1
Oct 06 12:59:04 localhost augenrules[722]: pid 702
Oct 06 12:59:04 localhost augenrules[722]: rate_limit 0
Oct 06 12:59:04 localhost augenrules[722]: backlog_limit 8192
Oct 06 12:59:04 localhost augenrules[722]: lost 0
Oct 06 12:59:04 localhost augenrules[722]: backlog 0
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time 60000
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 06 12:59:04 localhost augenrules[722]: enabled 1
Oct 06 12:59:04 localhost augenrules[722]: failure 1
Oct 06 12:59:04 localhost augenrules[722]: pid 702
Oct 06 12:59:04 localhost augenrules[722]: rate_limit 0
Oct 06 12:59:04 localhost augenrules[722]: backlog_limit 8192
Oct 06 12:59:04 localhost augenrules[722]: lost 0
Oct 06 12:59:04 localhost augenrules[722]: backlog 4
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time 60000
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 06 12:59:04 localhost augenrules[722]: enabled 1
Oct 06 12:59:04 localhost augenrules[722]: failure 1
Oct 06 12:59:04 localhost augenrules[722]: pid 702
Oct 06 12:59:04 localhost augenrules[722]: rate_limit 0
Oct 06 12:59:04 localhost augenrules[722]: backlog_limit 8192
Oct 06 12:59:04 localhost augenrules[722]: lost 0
Oct 06 12:59:04 localhost augenrules[722]: backlog 4
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time 60000
Oct 06 12:59:04 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 06 12:59:04 localhost systemd[1]: Started Security Auditing Service.
Oct 06 12:59:04 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 06 12:59:04 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 06 12:59:04 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 06 12:59:04 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 06 12:59:04 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 06 12:59:04 localhost systemd[1]: Starting Update is Completed...
Oct 06 12:59:04 localhost systemd[1]: Finished Update is Completed.
Oct 06 12:59:04 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Oct 06 12:59:04 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 06 12:59:04 localhost systemd[1]: Reached target System Initialization.
Oct 06 12:59:04 localhost systemd[1]: Started dnf makecache --timer.
Oct 06 12:59:04 localhost systemd[1]: Started Daily rotation of log files.
Oct 06 12:59:04 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 06 12:59:04 localhost systemd[1]: Reached target Timer Units.
Oct 06 12:59:04 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 06 12:59:04 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 06 12:59:04 localhost systemd[1]: Reached target Socket Units.
Oct 06 12:59:04 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 06 12:59:04 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 06 12:59:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 06 12:59:04 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 06 12:59:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 06 12:59:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 06 12:59:04 localhost systemd-udevd[743]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 12:59:04 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 06 12:59:04 localhost systemd[1]: Reached target Basic System.
Oct 06 12:59:04 localhost dbus-broker-lau[742]: Ready
Oct 06 12:59:04 localhost systemd[1]: Starting NTP client/server...
Oct 06 12:59:04 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 06 12:59:04 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 06 12:59:04 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 06 12:59:04 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 06 12:59:04 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 06 12:59:04 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 06 12:59:04 localhost systemd[1]: Starting IPv4 firewall with iptables...
Oct 06 12:59:04 localhost systemd[1]: Started irqbalance daemon.
Oct 06 12:59:04 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 06 12:59:04 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 12:59:04 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 12:59:04 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 12:59:04 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 06 12:59:04 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 06 12:59:04 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 06 12:59:04 localhost chronyd[787]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 06 12:59:04 localhost chronyd[787]: Loaded 0 symmetric keys
Oct 06 12:59:04 localhost chronyd[787]: Using right/UTC timezone to obtain leap second data
Oct 06 12:59:04 localhost chronyd[787]: Loaded seccomp filter (level 2)
Oct 06 12:59:04 localhost systemd[1]: Starting User Login Management...
Oct 06 12:59:04 localhost systemd[1]: Started NTP client/server.
Oct 06 12:59:04 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 06 12:59:04 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 06 12:59:04 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 06 12:59:04 localhost systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 06 12:59:04 localhost systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 06 12:59:04 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 06 12:59:04 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 06 12:59:04 localhost kernel: kvm_amd: TSC scaling supported
Oct 06 12:59:04 localhost kernel: kvm_amd: Nested Virtualization enabled
Oct 06 12:59:04 localhost kernel: kvm_amd: Nested Paging enabled
Oct 06 12:59:04 localhost kernel: kvm_amd: LBR virtualization supported
Oct 06 12:59:04 localhost systemd-logind[789]: New seat seat0.
Oct 06 12:59:04 localhost systemd[1]: Started User Login Management.
Oct 06 12:59:04 localhost kernel: Console: switching to colour dummy device 80x25
Oct 06 12:59:04 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 06 12:59:04 localhost kernel: [drm] features: -context_init
Oct 06 12:59:04 localhost kernel: [drm] number of scanouts: 1
Oct 06 12:59:04 localhost kernel: [drm] number of cap sets: 0
Oct 06 12:59:04 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 06 12:59:04 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 06 12:59:04 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 06 12:59:04 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 06 12:59:04 localhost iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Oct 06 12:59:04 localhost systemd[1]: Finished IPv4 firewall with iptables.
Oct 06 12:59:05 localhost cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 06 Oct 2025 12:59:05 +0000. Up 7.04 seconds.
Oct 06 12:59:05 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 06 12:59:05 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 06 12:59:05 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp0pn25ur3.mount: Deactivated successfully.
Oct 06 12:59:05 localhost systemd[1]: Starting Hostname Service...
Oct 06 12:59:05 localhost systemd[1]: Started Hostname Service.
Oct 06 12:59:05 np0005472061.novalocal systemd-hostnamed[853]: Hostname set to <np0005472061.novalocal> (static)
Oct 06 12:59:05 np0005472061.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 06 12:59:05 np0005472061.novalocal systemd[1]: Reached target Preparation for Network.
Oct 06 12:59:05 np0005472061.novalocal systemd[1]: Starting Network Manager...
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9314] NetworkManager (version 1.54.1-1.el9) is starting... (boot:58776300-6201-422f-aac2-b277cfa9c8d1)
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9320] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9482] manager[0x55f49961f080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9523] hostname: hostname: using hostnamed
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9523] hostname: static hostname changed from (none) to "np0005472061.novalocal"
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9527] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9645] manager[0x55f49961f080]: rfkill: Wi-Fi hardware radio set enabled
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9646] manager[0x55f49961f080]: rfkill: WWAN hardware radio set enabled
Oct 06 12:59:05 np0005472061.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9754] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9755] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9756] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9757] manager: Networking is enabled by state file
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9760] settings: Loaded settings plugin: keyfile (internal)
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9901] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9935] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9969] dhcp: init: Using DHCP client 'internal'
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9972] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 06 12:59:05 np0005472061.novalocal NetworkManager[857]: <info>  [1759755545.9988] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0001] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0009] device (lo): Activation: starting connection 'lo' (d21fa287-6890-4cd6-bfdf-64464e1b01d1)
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0019] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0022] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0052] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0056] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0058] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0060] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0062] device (eth0): carrier: link connected
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0065] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0089] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0096] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0100] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0101] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0103] manager: NetworkManager state is now CONNECTING
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0104] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0111] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0114] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Started Network Manager.
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0161] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Reached target Network.
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0172] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0194] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0348] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0350] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0358] device (lo): Activation: successful, device activated.
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0365] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0370] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0373] manager: NetworkManager state is now CONNECTED_SITE
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0376] device (eth0): Activation: successful, device activated.
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0384] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 06 12:59:06 np0005472061.novalocal NetworkManager[857]: <info>  [1759755546.0385] manager: startup complete
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Reached target NFS client services.
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Reached target Remote File Systems.
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 06 12:59:06 np0005472061.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 06 Oct 2025 12:59:06 +0000. Up 8.07 seconds.
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.150         | 255.255.255.0 | global | fa:16:3e:45:a6:7d |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe45:a67d/64 |       .       |  link  | fa:16:3e:45:a6:7d |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct 06 12:59:06 np0005472061.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: new group: name=cloud-user, GID=1001
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: add 'cloud-user' to group 'adm'
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: add 'cloud-user' to group 'systemd-journal'
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: add 'cloud-user' to shadow group 'adm'
Oct 06 12:59:07 np0005472061.novalocal useradd[986]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Generating public/private rsa key pair.
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key fingerprint is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: SHA256:YdN0eTS9HQOM1hi4Kkeln0KiUPt+4CSC4LgFYozSWSI root@np0005472061.novalocal
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key's randomart image is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +---[RSA 3072]----+
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |          o.B++. |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |E ...    = =.o.+.|
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |oo.+.   * +  .  =|
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |*+o. . = +     ..|
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |O.. o + S .      |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |oo.o = + o       |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: | o. = + .        |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |.    o .         |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |      .          |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +----[SHA256]-----+
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key fingerprint is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: SHA256:lZbilpMugV7XcqZ3kYqSU3Lo4FSSdOlM7AYhCKUWFWk root@np0005472061.novalocal
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key's randomart image is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +---[ECDSA 256]---+
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |o++o=o..         |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |...E.o+    o     |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |... o*. . =      |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |.    +=o B   .   |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |    +.= S + o    |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |   + + X B . .   |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |    o * + o .    |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |       + . .     |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |                 |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +----[SHA256]-----+
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key fingerprint is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: SHA256:OlooNv6FV067t2EOHlE+4s7WsvqEeME9kCjm3rMDy3U root@np0005472061.novalocal
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: The key's randomart image is:
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +--[ED25519 256]--+
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |                 |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |     . .         |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |  o . o   .      |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: | o . . o o       |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |  .   o S o      |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: | ....+EO = .     |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: | .=+*.O B.o      |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: | ooo.O *o*o.     |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: |  ..+..+B+o.     |
Oct 06 12:59:07 np0005472061.novalocal cloud-init[920]: +----[SHA256]-----+
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Reached target Network is Online.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting System Logging Service...
Oct 06 12:59:07 np0005472061.novalocal sm-notify[1001]: Version 2.5.4 starting
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting Permit User Sessions...
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Finished Permit User Sessions.
Oct 06 12:59:07 np0005472061.novalocal sshd[1003]: Server listening on 0.0.0.0 port 22.
Oct 06 12:59:07 np0005472061.novalocal sshd[1003]: Server listening on :: port 22.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started Command Scheduler.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started Getty on tty1.
Oct 06 12:59:07 np0005472061.novalocal crond[1005]: (CRON) STARTUP (1.5.7)
Oct 06 12:59:07 np0005472061.novalocal crond[1005]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 06 12:59:07 np0005472061.novalocal crond[1005]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 19% if used.)
Oct 06 12:59:07 np0005472061.novalocal crond[1005]: (CRON) INFO (running with inotify support)
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Reached target Login Prompts.
Oct 06 12:59:07 np0005472061.novalocal rsyslogd[1002]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1002" x-info="https://www.rsyslog.com"] start
Oct 06 12:59:07 np0005472061.novalocal rsyslogd[1002]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Started System Logging Service.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Reached target Multi-User System.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 06 12:59:07 np0005472061.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 06 12:59:07 np0005472061.novalocal rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1014]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 06 Oct 2025 12:59:08 +0000. Up 9.74 seconds.
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1015]: Connection reset by 38.102.83.114 port 41598 [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1017]: Unable to negotiate with 38.102.83.114 port 41610: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 06 12:59:08 np0005472061.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1019]: Connection closed by 38.102.83.114 port 41612 [preauth]
Oct 06 12:59:08 np0005472061.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1022]: Unable to negotiate with 38.102.83.114 port 41624: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1024]: Unable to negotiate with 38.102.83.114 port 41634: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1030]: Unable to negotiate with 38.102.83.114 port 41658: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1032]: Unable to negotiate with 38.102.83.114 port 41664: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1026]: Connection closed by 38.102.83.114 port 41640 [preauth]
Oct 06 12:59:08 np0005472061.novalocal sshd-session[1028]: Connection closed by 38.102.83.114 port 41642 [preauth]
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1036]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 06 Oct 2025 12:59:08 +0000. Up 10.11 seconds.
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1038]: #############################################################
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1039]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1041]: 256 SHA256:lZbilpMugV7XcqZ3kYqSU3Lo4FSSdOlM7AYhCKUWFWk root@np0005472061.novalocal (ECDSA)
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1043]: 256 SHA256:OlooNv6FV067t2EOHlE+4s7WsvqEeME9kCjm3rMDy3U root@np0005472061.novalocal (ED25519)
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1045]: 3072 SHA256:YdN0eTS9HQOM1hi4Kkeln0KiUPt+4CSC4LgFYozSWSI root@np0005472061.novalocal (RSA)
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1046]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1047]: #############################################################
Oct 06 12:59:08 np0005472061.novalocal cloud-init[1036]: Cloud-init v. 24.4-7.el9 finished at Mon, 06 Oct 2025 12:59:08 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.28 seconds
Oct 06 12:59:08 np0005472061.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Oct 06 12:59:08 np0005472061.novalocal systemd[1]: Reached target Cloud-init target.
Oct 06 12:59:08 np0005472061.novalocal systemd[1]: Startup finished in 1.686s (kernel) + 2.932s (initrd) + 5.752s (userspace) = 10.371s.
Oct 06 12:59:11 np0005472061.novalocal chronyd[787]: Selected source 198.181.199.86 (2.centos.pool.ntp.org)
Oct 06 12:59:11 np0005472061.novalocal chronyd[787]: System clock TAI offset set to 37 seconds
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 25 affinity is now unmanaged
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 31 affinity is now unmanaged
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 28 affinity is now unmanaged
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 32 affinity is now unmanaged
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 30 affinity is now unmanaged
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 06 12:59:15 np0005472061.novalocal irqbalance[782]: IRQ 29 affinity is now unmanaged
Oct 06 12:59:16 np0005472061.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 12:59:23 np0005472061.novalocal sshd-session[1052]: banner exchange: Connection from 195.178.110.15 port 56842: invalid format
Oct 06 12:59:23 np0005472061.novalocal sshd-session[1053]: banner exchange: Connection from 195.178.110.15 port 56854: invalid format
Oct 06 12:59:32 np0005472061.novalocal sshd-session[1054]: Accepted publickey for zuul from 38.102.83.114 port 60786 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 06 12:59:32 np0005472061.novalocal systemd-logind[789]: New session 1 of user zuul.
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Queued start job for default target Main User Target.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Created slice User Application Slice.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Started Daily Cleanup of User's Temporary Directories.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Reached target Paths.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Reached target Timers.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Starting D-Bus User Message Bus Socket...
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Starting Create User's Volatile Files and Directories...
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Finished Create User's Volatile Files and Directories.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Listening on D-Bus User Message Bus Socket.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Reached target Sockets.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Reached target Basic System.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Reached target Main User Target.
Oct 06 12:59:32 np0005472061.novalocal systemd[1058]: Startup finished in 154ms.
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 06 12:59:32 np0005472061.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 06 12:59:32 np0005472061.novalocal sshd-session[1054]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 12:59:33 np0005472061.novalocal python3[1140]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 12:59:35 np0005472061.novalocal python3[1168]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 12:59:35 np0005472061.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 06 12:59:42 np0005472061.novalocal python3[1228]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 12:59:43 np0005472061.novalocal python3[1268]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 06 12:59:45 np0005472061.novalocal python3[1294]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmaNuHdBunv0yvXLZl8yerio6bpjTgfWAlxDLT6mmgaSaKnmY5E44+fx/cADD4iriQyCqkum+IpW02oSrrGmUnQIq0A+f7SPiKChsFKO3PVBbsTRNsE+2vopWAL9MzOAjTyYmMwhR3BynR1RNxxuSwR31xFY0D487JCpG4HW+wj+WkTQJs5c55VPgIEgCpcT0R8i7eHhW8/mCyPH9HDTgV9YC2rmfKl6ZmOlqaRJMfDK8EPVivnS/gZKTNgd78m1aqYdwEkn+WPt7z9BOO1rMElN9cklj8VNS1kjiQvaU2+q9tpY5onmJYGgCsuYn0lahwu3Ltty5/dJIkQVOrM4I1vxG3GCpp3ldUn2zHCwFqZ9G8rqEdw51ueHpAl00F+FVySC89JZ20f+YJYaQ+vPawmhDdMzz0PjggSm92iwDCgyu09gc3bNu1cNydXjdl5D7dybFi0GtfPNHiVjcINyoZlx9YTFMTS66ZcJ7JIn/bG3OJH3TrRV4VOVlCOkP28HU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:46 np0005472061.novalocal python3[1318]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:46 np0005472061.novalocal python3[1417]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 12:59:47 np0005472061.novalocal python3[1488]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759755586.3727028-229-86649999601693/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=11b6fe18bf0e4050ab1daa4783c3bc00_id_rsa follow=False checksum=f3227caf0dd541b615ec841689bd0783798beeb4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:47 np0005472061.novalocal python3[1611]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 12:59:48 np0005472061.novalocal python3[1682]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759755587.321041-273-45210085260990/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=11b6fe18bf0e4050ab1daa4783c3bc00_id_rsa.pub follow=False checksum=094ecdb94df8584a73affede60aa277a3d83f3cd backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:49 np0005472061.novalocal python3[1730]: ansible-ping Invoked with data=pong
Oct 06 12:59:50 np0005472061.novalocal python3[1754]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 12:59:52 np0005472061.novalocal python3[1812]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 06 12:59:53 np0005472061.novalocal python3[1844]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:53 np0005472061.novalocal python3[1868]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:53 np0005472061.novalocal python3[1892]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:54 np0005472061.novalocal python3[1916]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:54 np0005472061.novalocal python3[1940]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:54 np0005472061.novalocal python3[1964]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:56 np0005472061.novalocal sudo[1988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sejekxyhzgntdncijntrtnxjfwbtcakb ; /usr/bin/python3'
Oct 06 12:59:56 np0005472061.novalocal sudo[1988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 12:59:56 np0005472061.novalocal python3[1990]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:56 np0005472061.novalocal sudo[1988]: pam_unix(sudo:session): session closed for user root
Oct 06 12:59:56 np0005472061.novalocal sudo[2066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myygnjnjpecvfilbihirbmelgckdkvjp ; /usr/bin/python3'
Oct 06 12:59:56 np0005472061.novalocal sudo[2066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 12:59:57 np0005472061.novalocal python3[2068]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 12:59:57 np0005472061.novalocal sudo[2066]: pam_unix(sudo:session): session closed for user root
Oct 06 12:59:57 np0005472061.novalocal sudo[2139]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtbsygofdjkhrwhhxbnmuermqumquhzm ; /usr/bin/python3'
Oct 06 12:59:57 np0005472061.novalocal sudo[2139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 12:59:57 np0005472061.novalocal python3[2141]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759755596.659477-26-158670796547068/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 12:59:57 np0005472061.novalocal sudo[2139]: pam_unix(sudo:session): session closed for user root
Oct 06 12:59:58 np0005472061.novalocal python3[2189]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:58 np0005472061.novalocal python3[2213]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:58 np0005472061.novalocal python3[2237]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:59 np0005472061.novalocal python3[2261]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:59 np0005472061.novalocal python3[2285]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:59 np0005472061.novalocal python3[2309]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 12:59:59 np0005472061.novalocal python3[2333]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:00 np0005472061.novalocal python3[2357]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:00 np0005472061.novalocal python3[2381]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:00 np0005472061.novalocal python3[2405]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:01 np0005472061.novalocal python3[2429]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:01 np0005472061.novalocal python3[2453]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:01 np0005472061.novalocal python3[2477]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:01 np0005472061.novalocal python3[2501]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:02 np0005472061.novalocal python3[2525]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:02 np0005472061.novalocal python3[2549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:02 np0005472061.novalocal python3[2573]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:03 np0005472061.novalocal python3[2597]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:03 np0005472061.novalocal python3[2621]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:03 np0005472061.novalocal python3[2645]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:03 np0005472061.novalocal python3[2669]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:04 np0005472061.novalocal python3[2693]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:04 np0005472061.novalocal python3[2717]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:04 np0005472061.novalocal python3[2741]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:04 np0005472061.novalocal python3[2765]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:05 np0005472061.novalocal python3[2789]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:00:07 np0005472061.novalocal sudo[2813]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auqwsiredauymhhkazlocfsspgmudhui ; /usr/bin/python3'
Oct 06 13:00:07 np0005472061.novalocal sudo[2813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:07 np0005472061.novalocal python3[2815]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 06 13:00:07 np0005472061.novalocal systemd[1]: Starting Time & Date Service...
Oct 06 13:00:07 np0005472061.novalocal systemd[1]: Started Time & Date Service.
Oct 06 13:00:07 np0005472061.novalocal systemd-timedated[2817]: Changed time zone to 'UTC' (UTC).
Oct 06 13:00:07 np0005472061.novalocal sudo[2813]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:08 np0005472061.novalocal sudo[2844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgffuekstvrxdqdxrbitirndgjysaiab ; /usr/bin/python3'
Oct 06 13:00:08 np0005472061.novalocal sudo[2844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:08 np0005472061.novalocal python3[2846]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:08 np0005472061.novalocal sudo[2844]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:08 np0005472061.novalocal python3[2922]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:00:09 np0005472061.novalocal python3[2993]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759755608.3820944-202-2305090052765/source _original_basename=tmp1wiouexh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:09 np0005472061.novalocal python3[3093]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:00:09 np0005472061.novalocal python3[3164]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759755609.1831832-242-112491959728554/source _original_basename=tmpbj_2ue6w follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:10 np0005472061.novalocal sudo[3264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jboowfwmetvkooxzesghgagdmpokpqqr ; /usr/bin/python3'
Oct 06 13:00:10 np0005472061.novalocal sudo[3264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:10 np0005472061.novalocal python3[3266]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:00:10 np0005472061.novalocal sudo[3264]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:10 np0005472061.novalocal sudo[3337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghzacgmuudgnoctogwqvwonlnxfeehck ; /usr/bin/python3'
Oct 06 13:00:10 np0005472061.novalocal sudo[3337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:11 np0005472061.novalocal python3[3339]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759755610.2366612-306-277635265280405/source _original_basename=tmp8hokj21p follow=False checksum=12efaaf67f4d002c9317067f1840bb831c38c306 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:11 np0005472061.novalocal sudo[3337]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:11 np0005472061.novalocal python3[3387]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:00:11 np0005472061.novalocal python3[3413]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:00:12 np0005472061.novalocal sudo[3491]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxnuexfakqpehszhkjfvwzphitgaxrxq ; /usr/bin/python3'
Oct 06 13:00:12 np0005472061.novalocal sudo[3491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:12 np0005472061.novalocal python3[3493]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:00:12 np0005472061.novalocal sudo[3491]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:12 np0005472061.novalocal sudo[3564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqasgvjdhlxnpckofakurtagwgvtwire ; /usr/bin/python3'
Oct 06 13:00:12 np0005472061.novalocal sudo[3564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:12 np0005472061.novalocal python3[3566]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759755611.9724748-362-182491106523520/source _original_basename=tmpjxk28x5r follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:12 np0005472061.novalocal sudo[3564]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:12 np0005472061.novalocal sudo[3615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kijmtafnjnxkfmttdccrvkuepycalfwr ; /usr/bin/python3'
Oct 06 13:00:12 np0005472061.novalocal sudo[3615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:13 np0005472061.novalocal python3[3617]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-1ff9-8d74-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:00:13 np0005472061.novalocal sudo[3615]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:13 np0005472061.novalocal python3[3645]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-1ff9-8d74-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 06 13:00:15 np0005472061.novalocal python3[3673]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:31 np0005472061.novalocal sudo[3697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uairrsqeccfsmujtyikstzkqzmicbemj ; /usr/bin/python3'
Oct 06 13:00:31 np0005472061.novalocal sudo[3697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:00:32 np0005472061.novalocal python3[3699]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:00:32 np0005472061.novalocal sudo[3697]: pam_unix(sudo:session): session closed for user root
Oct 06 13:00:37 np0005472061.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 06 13:01:01 np0005472061.novalocal CROND[3703]: (root) CMD (run-parts /etc/cron.hourly)
Oct 06 13:01:01 np0005472061.novalocal run-parts[3706]: (/etc/cron.hourly) starting 0anacron
Oct 06 13:01:01 np0005472061.novalocal anacron[3714]: Anacron started on 2025-10-06
Oct 06 13:01:01 np0005472061.novalocal anacron[3714]: Will run job `cron.daily' in 11 min.
Oct 06 13:01:01 np0005472061.novalocal anacron[3714]: Will run job `cron.weekly' in 31 min.
Oct 06 13:01:01 np0005472061.novalocal anacron[3714]: Will run job `cron.monthly' in 51 min.
Oct 06 13:01:01 np0005472061.novalocal anacron[3714]: Jobs will be executed sequentially
Oct 06 13:01:01 np0005472061.novalocal run-parts[3716]: (/etc/cron.hourly) finished 0anacron
Oct 06 13:01:01 np0005472061.novalocal CROND[3702]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 06 13:01:05 np0005472061.novalocal sshd-session[3717]: Invalid user validator from 45.148.10.240 port 36510
Oct 06 13:01:05 np0005472061.novalocal sshd-session[3717]: Connection closed by invalid user validator 45.148.10.240 port 36510 [preauth]
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 06 13:01:13 np0005472061.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 06 13:01:13 np0005472061.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4603] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 06 13:01:13 np0005472061.novalocal systemd-udevd[3720]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4826] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4859] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4865] device (eth1): carrier: link connected
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4868] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4875] policy: auto-activating connection 'Wired connection 1' (aa3ec181-3406-36ec-85f4-51c2c84686c7)
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4880] device (eth1): Activation: starting connection 'Wired connection 1' (aa3ec181-3406-36ec-85f4-51c2c84686c7)
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4882] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4886] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4891] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:01:13 np0005472061.novalocal NetworkManager[857]: <info>  [1759755673.4896] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:01:14 np0005472061.novalocal python3[3746]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-7f68-1324-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:01:20 np0005472061.novalocal sudo[3824]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbnynovpyrdpqbjqzcwktrnbavtwhnrz ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 06 13:01:20 np0005472061.novalocal sudo[3824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:01:21 np0005472061.novalocal python3[3826]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:01:21 np0005472061.novalocal sudo[3824]: pam_unix(sudo:session): session closed for user root
Oct 06 13:01:21 np0005472061.novalocal sudo[3897]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piwdgndzaknmdkqverhdozeijhrfcdzn ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 06 13:01:21 np0005472061.novalocal sudo[3897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:01:21 np0005472061.novalocal python3[3899]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759755680.7852132-103-229881256665069/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=87099df9db5f44e9af239b59564406b9c5601b37 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:01:21 np0005472061.novalocal sudo[3897]: pam_unix(sudo:session): session closed for user root
Oct 06 13:01:21 np0005472061.novalocal sudo[3947]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opbcwwolsmggedzexptakculbpwmdmvm ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 06 13:01:21 np0005472061.novalocal sudo[3947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:01:22 np0005472061.novalocal python3[3949]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Stopping Network Manager...
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2804] caught SIGTERM, shutting down normally.
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2816] dhcp4 (eth0): canceled DHCP transaction
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2816] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2816] dhcp4 (eth0): state changed no lease
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2819] manager: NetworkManager state is now CONNECTING
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2921] dhcp4 (eth1): canceled DHCP transaction
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2921] dhcp4 (eth1): state changed no lease
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[857]: <info>  [1759755682.2978] exiting (success)
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Stopped Network Manager.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: NetworkManager.service: Consumed 1.024s CPU time, 9.9M memory peak.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Starting Network Manager...
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.3486] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:58776300-6201-422f-aac2-b277cfa9c8d1)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.3489] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.3556] manager[0x562ff6d78070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Starting Hostname Service...
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Started Hostname Service.
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4294] hostname: hostname: using hostnamed
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4294] hostname: static hostname changed from (none) to "np0005472061.novalocal"
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4302] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4310] manager[0x562ff6d78070]: rfkill: Wi-Fi hardware radio set enabled
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4310] manager[0x562ff6d78070]: rfkill: WWAN hardware radio set enabled
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4362] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4363] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4364] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4365] manager: Networking is enabled by state file
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4369] settings: Loaded settings plugin: keyfile (internal)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4376] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4425] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4442] dhcp: init: Using DHCP client 'internal'
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4447] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4455] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4466] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4481] device (lo): Activation: starting connection 'lo' (d21fa287-6890-4cd6-bfdf-64464e1b01d1)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4492] device (eth0): carrier: link connected
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4499] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4511] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4512] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4524] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4537] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4549] device (eth1): carrier: link connected
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4559] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4570] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (aa3ec181-3406-36ec-85f4-51c2c84686c7) (indicated)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4571] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4583] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4598] device (eth1): Activation: starting connection 'Wired connection 1' (aa3ec181-3406-36ec-85f4-51c2c84686c7)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4607] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Started Network Manager.
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4614] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4618] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4622] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4626] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4633] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4637] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4642] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4647] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4658] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4663] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4678] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4683] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4713] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4722] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.4737] device (lo): Activation: successful, device activated.
Oct 06 13:01:22 np0005472061.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 06 13:01:22 np0005472061.novalocal sudo[3947]: pam_unix(sudo:session): session closed for user root
Oct 06 13:01:22 np0005472061.novalocal python3[4014]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-7f68-1324-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.8888] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.8911] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.8999] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.9038] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.9040] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.9044] manager: NetworkManager state is now CONNECTED_SITE
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.9048] device (eth0): Activation: successful, device activated.
Oct 06 13:01:22 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755682.9055] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 06 13:01:32 np0005472061.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:01:51 np0005472061.novalocal systemd[1058]: Starting Mark boot as successful...
Oct 06 13:01:51 np0005472061.novalocal systemd[1058]: Finished Mark boot as successful.
Oct 06 13:01:52 np0005472061.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.2996] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 06 13:02:07 np0005472061.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:02:07 np0005472061.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3339] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3342] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3350] device (eth1): Activation: successful, device activated.
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3355] manager: startup complete
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3356] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <warn>  [1759755727.3360] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3368] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3490] dhcp4 (eth1): canceled DHCP transaction
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3491] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3491] dhcp4 (eth1): state changed no lease
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3517] policy: auto-activating connection 'ci-private-network' (ee89cfcf-f0a3-5a6c-9105-8e4b6101123c)
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3524] device (eth1): Activation: starting connection 'ci-private-network' (ee89cfcf-f0a3-5a6c-9105-8e4b6101123c)
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3526] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3531] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3542] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3556] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3616] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3619] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:02:07 np0005472061.novalocal NetworkManager[3953]: <info>  [1759755727.3632] device (eth1): Activation: successful, device activated.
Oct 06 13:02:17 np0005472061.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:02:22 np0005472061.novalocal sshd-session[1067]: Received disconnect from 38.102.83.114 port 60786:11: disconnected by user
Oct 06 13:02:22 np0005472061.novalocal sshd-session[1067]: Disconnected from user zuul 38.102.83.114 port 60786
Oct 06 13:02:22 np0005472061.novalocal sshd-session[1054]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:02:22 np0005472061.novalocal systemd-logind[789]: Session 1 logged out. Waiting for processes to exit.
Oct 06 13:02:48 np0005472061.novalocal sshd-session[4062]: Accepted publickey for zuul from 38.102.83.114 port 57524 ssh2: RSA SHA256:Jx12jaLmKdzqWUaxClrd355NjuSq2gOyPD0e5qs8aYc
Oct 06 13:02:48 np0005472061.novalocal systemd-logind[789]: New session 3 of user zuul.
Oct 06 13:02:48 np0005472061.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 06 13:02:48 np0005472061.novalocal sshd-session[4062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:02:48 np0005472061.novalocal sudo[4141]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whrjaidmrxopxlsiqocbueawwtxcduhz ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 06 13:02:48 np0005472061.novalocal sudo[4141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:02:48 np0005472061.novalocal python3[4143]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:02:48 np0005472061.novalocal sudo[4141]: pam_unix(sudo:session): session closed for user root
Oct 06 13:02:48 np0005472061.novalocal sudo[4214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liivncueebyicjgbeusexoyhngchbmyq ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 06 13:02:48 np0005472061.novalocal sudo[4214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:02:49 np0005472061.novalocal python3[4216]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759755768.3615932-312-65877705832210/source _original_basename=tmpvtv3vgfk follow=False checksum=1d41f2ea1f48b551451d74b562620b6f78594815 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:02:49 np0005472061.novalocal sudo[4214]: pam_unix(sudo:session): session closed for user root
Oct 06 13:02:52 np0005472061.novalocal sshd-session[4065]: Connection closed by 38.102.83.114 port 57524
Oct 06 13:02:52 np0005472061.novalocal sshd-session[4062]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:02:52 np0005472061.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 06 13:02:52 np0005472061.novalocal systemd-logind[789]: Session 3 logged out. Waiting for processes to exit.
Oct 06 13:02:52 np0005472061.novalocal systemd-logind[789]: Removed session 3.
Oct 06 13:04:33 np0005472061.novalocal sshd-session[4241]: Connection closed by 59.15.99.151 port 40920
Oct 06 13:04:39 np0005472061.novalocal sshd-session[4242]: Connection closed by authenticating user root 59.15.99.151 port 40988 [preauth]
Oct 06 13:04:42 np0005472061.novalocal sshd-session[4245]: Invalid user admin from 59.15.99.151 port 41494
Oct 06 13:04:42 np0005472061.novalocal sshd-session[4245]: Connection closed by invalid user admin 59.15.99.151 port 41494 [preauth]
Oct 06 13:04:46 np0005472061.novalocal sshd-session[4247]: Invalid user postgres from 59.15.99.151 port 41878
Oct 06 13:04:47 np0005472061.novalocal sshd-session[4247]: Connection closed by invalid user postgres 59.15.99.151 port 41878 [preauth]
Oct 06 13:04:49 np0005472061.novalocal sshd-session[4249]: Invalid user oracle from 59.15.99.151 port 42356
Oct 06 13:04:50 np0005472061.novalocal sshd-session[4249]: Connection closed by invalid user oracle 59.15.99.151 port 42356 [preauth]
Oct 06 13:04:51 np0005472061.novalocal systemd[1058]: Created slice User Background Tasks Slice.
Oct 06 13:04:51 np0005472061.novalocal systemd[1058]: Starting Cleanup of User's Temporary Files and Directories...
Oct 06 13:04:51 np0005472061.novalocal systemd[1058]: Finished Cleanup of User's Temporary Files and Directories.
Oct 06 13:04:53 np0005472061.novalocal sshd-session[4251]: Invalid user steam from 59.15.99.151 port 42696
Oct 06 13:04:54 np0005472061.novalocal sshd-session[4251]: Connection closed by invalid user steam 59.15.99.151 port 42696 [preauth]
Oct 06 13:04:57 np0005472061.novalocal sshd-session[4256]: Invalid user debian from 59.15.99.151 port 43122
Oct 06 13:04:57 np0005472061.novalocal sshd-session[4256]: Connection closed by invalid user debian 59.15.99.151 port 43122 [preauth]
Oct 06 13:05:00 np0005472061.novalocal sshd-session[4258]: Invalid user pgbouncer from 59.15.99.151 port 43484
Oct 06 13:05:01 np0005472061.novalocal sshd-session[4258]: Connection closed by invalid user pgbouncer 59.15.99.151 port 43484 [preauth]
Oct 06 13:05:04 np0005472061.novalocal sshd-session[4260]: Connection closed by authenticating user root 59.15.99.151 port 43888 [preauth]
Oct 06 13:05:07 np0005472061.novalocal sshd-session[4262]: Invalid user postgres from 59.15.99.151 port 44242
Oct 06 13:05:07 np0005472061.novalocal sshd-session[4262]: Connection closed by invalid user postgres 59.15.99.151 port 44242 [preauth]
Oct 06 13:05:10 np0005472061.novalocal sshd-session[4264]: Invalid user devops from 59.15.99.151 port 44648
Oct 06 13:05:11 np0005472061.novalocal sshd-session[4264]: Connection closed by invalid user devops 59.15.99.151 port 44648 [preauth]
Oct 06 13:05:15 np0005472061.novalocal sshd-session[4266]: Invalid user vyos from 59.15.99.151 port 45022
Oct 06 13:05:15 np0005472061.novalocal sshd-session[4266]: Connection closed by invalid user vyos 59.15.99.151 port 45022 [preauth]
Oct 06 13:05:18 np0005472061.novalocal sshd-session[4268]: Invalid user ubuntu from 59.15.99.151 port 45422
Oct 06 13:05:18 np0005472061.novalocal sshd-session[4268]: Connection closed by invalid user ubuntu 59.15.99.151 port 45422 [preauth]
Oct 06 13:05:22 np0005472061.novalocal sshd-session[4270]: Invalid user user from 59.15.99.151 port 45794
Oct 06 13:05:22 np0005472061.novalocal sshd-session[4270]: Connection closed by invalid user user 59.15.99.151 port 45794 [preauth]
Oct 06 13:05:25 np0005472061.novalocal sshd-session[4272]: Connection closed by authenticating user root 59.15.99.151 port 46154 [preauth]
Oct 06 13:05:28 np0005472061.novalocal sshd-session[4274]: Invalid user test from 59.15.99.151 port 46448
Oct 06 13:05:28 np0005472061.novalocal sshd-session[4274]: Connection closed by invalid user test 59.15.99.151 port 46448 [preauth]
Oct 06 13:05:31 np0005472061.novalocal sshd-session[4276]: Invalid user postgres from 59.15.99.151 port 46742
Oct 06 13:05:32 np0005472061.novalocal sshd-session[4276]: Connection closed by invalid user postgres 59.15.99.151 port 46742 [preauth]
Oct 06 13:05:38 np0005472061.novalocal sshd-session[4278]: Connection closed by authenticating user root 59.15.99.151 port 47112 [preauth]
Oct 06 13:05:42 np0005472061.novalocal sshd-session[4280]: Invalid user postgres from 59.15.99.151 port 47592
Oct 06 13:05:43 np0005472061.novalocal sshd-session[4280]: Connection closed by invalid user postgres 59.15.99.151 port 47592 [preauth]
Oct 06 13:05:46 np0005472061.novalocal sshd-session[4282]: Invalid user guest from 59.15.99.151 port 48004
Oct 06 13:05:46 np0005472061.novalocal sshd-session[4282]: Connection closed by invalid user guest 59.15.99.151 port 48004 [preauth]
Oct 06 13:05:49 np0005472061.novalocal sshd-session[4284]: Invalid user deploy from 59.15.99.151 port 48352
Oct 06 13:05:50 np0005472061.novalocal sshd-session[4284]: Connection closed by invalid user deploy 59.15.99.151 port 48352 [preauth]
Oct 06 13:05:52 np0005472061.novalocal sshd-session[4286]: Invalid user user from 59.15.99.151 port 48644
Oct 06 13:05:53 np0005472061.novalocal sshd-session[4286]: Connection closed by invalid user user 59.15.99.151 port 48644 [preauth]
Oct 06 13:05:55 np0005472061.novalocal sshd-session[4288]: Invalid user fa from 59.15.99.151 port 48988
Oct 06 13:05:56 np0005472061.novalocal sshd-session[4288]: Connection closed by invalid user fa 59.15.99.151 port 48988 [preauth]
Oct 06 13:05:59 np0005472061.novalocal sshd-session[4290]: Invalid user user from 59.15.99.151 port 49318
Oct 06 13:05:59 np0005472061.novalocal sshd-session[4290]: Connection closed by invalid user user 59.15.99.151 port 49318 [preauth]
Oct 06 13:06:03 np0005472061.novalocal sshd-session[4292]: Invalid user kali from 59.15.99.151 port 49670
Oct 06 13:06:03 np0005472061.novalocal sshd-session[4292]: Connection closed by invalid user kali 59.15.99.151 port 49670 [preauth]
Oct 06 13:06:07 np0005472061.novalocal sshd-session[4294]: Connection closed by authenticating user root 59.15.99.151 port 50070 [preauth]
Oct 06 13:06:11 np0005472061.novalocal sshd-session[4296]: Invalid user admin from 59.15.99.151 port 50482
Oct 06 13:06:11 np0005472061.novalocal sshd-session[4296]: Connection closed by invalid user admin 59.15.99.151 port 50482 [preauth]
Oct 06 13:06:14 np0005472061.novalocal sshd-session[4298]: Invalid user deploy from 59.15.99.151 port 50874
Oct 06 13:06:14 np0005472061.novalocal sshd-session[4298]: Connection closed by invalid user deploy 59.15.99.151 port 50874 [preauth]
Oct 06 13:06:18 np0005472061.novalocal sshd-session[4300]: Invalid user oracle from 59.15.99.151 port 51210
Oct 06 13:06:18 np0005472061.novalocal sshd-session[4300]: Connection closed by invalid user oracle 59.15.99.151 port 51210 [preauth]
Oct 06 13:06:22 np0005472061.novalocal sshd-session[4302]: Invalid user devopsuser from 59.15.99.151 port 51660
Oct 06 13:06:23 np0005472061.novalocal sshd-session[4302]: Connection closed by invalid user devopsuser 59.15.99.151 port 51660 [preauth]
Oct 06 13:06:26 np0005472061.novalocal sshd-session[4304]: Invalid user deploy from 59.15.99.151 port 52094
Oct 06 13:06:26 np0005472061.novalocal sshd-session[4304]: Connection closed by invalid user deploy 59.15.99.151 port 52094 [preauth]
Oct 06 13:06:30 np0005472061.novalocal sshd-session[4306]: Invalid user es from 59.15.99.151 port 52580
Oct 06 13:06:30 np0005472061.novalocal sshd-session[4306]: Connection closed by invalid user es 59.15.99.151 port 52580 [preauth]
Oct 06 13:06:36 np0005472061.novalocal sshd-session[4308]: Invalid user admin from 59.15.99.151 port 52902
Oct 06 13:06:37 np0005472061.novalocal sshd-session[4308]: Connection closed by invalid user admin 59.15.99.151 port 52902 [preauth]
Oct 06 13:06:41 np0005472061.novalocal sshd-session[4310]: Invalid user admin from 59.15.99.151 port 53578
Oct 06 13:06:41 np0005472061.novalocal sshd-session[4310]: Connection closed by invalid user admin 59.15.99.151 port 53578 [preauth]
Oct 06 13:06:45 np0005472061.novalocal sshd-session[4312]: Invalid user mysql from 59.15.99.151 port 54028
Oct 06 13:06:46 np0005472061.novalocal sshd-session[4312]: Connection closed by invalid user mysql 59.15.99.151 port 54028 [preauth]
Oct 06 13:06:49 np0005472061.novalocal sshd-session[4314]: Invalid user ubnt from 59.15.99.151 port 54514
Oct 06 13:06:49 np0005472061.novalocal sshd-session[4314]: Connection closed by invalid user ubnt 59.15.99.151 port 54514 [preauth]
Oct 06 13:06:53 np0005472061.novalocal sshd-session[4316]: Invalid user testuser from 59.15.99.151 port 54896
Oct 06 13:06:53 np0005472061.novalocal sshd-session[4316]: Connection closed by invalid user testuser 59.15.99.151 port 54896 [preauth]
Oct 06 13:06:56 np0005472061.novalocal sshd-session[4318]: Invalid user guest from 59.15.99.151 port 55322
Oct 06 13:06:57 np0005472061.novalocal sshd-session[4318]: Connection closed by invalid user guest 59.15.99.151 port 55322 [preauth]
Oct 06 13:07:00 np0005472061.novalocal sshd-session[4320]: Invalid user ubuntu from 59.15.99.151 port 55732
Oct 06 13:07:00 np0005472061.novalocal sshd-session[4320]: Connection closed by invalid user ubuntu 59.15.99.151 port 55732 [preauth]
Oct 06 13:07:03 np0005472061.novalocal sshd-session[4322]: Invalid user kali from 59.15.99.151 port 56092
Oct 06 13:07:04 np0005472061.novalocal sshd-session[4322]: Connection closed by invalid user kali 59.15.99.151 port 56092 [preauth]
Oct 06 13:07:07 np0005472061.novalocal sshd-session[4324]: Invalid user bouncer from 59.15.99.151 port 56538
Oct 06 13:07:08 np0005472061.novalocal sshd-session[4324]: Connection closed by invalid user bouncer 59.15.99.151 port 56538 [preauth]
Oct 06 13:07:11 np0005472061.novalocal sshd-session[4326]: Connection closed by authenticating user root 59.15.99.151 port 56910 [preauth]
Oct 06 13:07:16 np0005472061.novalocal sshd-session[4328]: Invalid user test from 59.15.99.151 port 57258
Oct 06 13:07:16 np0005472061.novalocal sshd-session[4328]: Connection closed by invalid user test 59.15.99.151 port 57258 [preauth]
Oct 06 13:07:20 np0005472061.novalocal sshd-session[4330]: Invalid user deploy from 59.15.99.151 port 57848
Oct 06 13:07:20 np0005472061.novalocal sshd-session[4330]: Connection closed by invalid user deploy 59.15.99.151 port 57848 [preauth]
Oct 06 13:07:24 np0005472061.novalocal sshd-session[4332]: Invalid user dspace from 59.15.99.151 port 58234
Oct 06 13:07:25 np0005472061.novalocal sshd-session[4332]: Connection closed by invalid user dspace 59.15.99.151 port 58234 [preauth]
Oct 06 13:07:29 np0005472061.novalocal sshd-session[4334]: Connection closed by authenticating user root 59.15.99.151 port 58602 [preauth]
Oct 06 13:07:32 np0005472061.novalocal sshd-session[4336]: Invalid user devopsadmin from 59.15.99.151 port 59062
Oct 06 13:07:33 np0005472061.novalocal sshd-session[4336]: Connection closed by invalid user devopsadmin 59.15.99.151 port 59062 [preauth]
Oct 06 13:07:39 np0005472061.novalocal sshd-session[4338]: Connection closed by authenticating user root 59.15.99.151 port 59394 [preauth]
Oct 06 13:07:42 np0005472061.novalocal sshd-session[4340]: Invalid user test from 59.15.99.151 port 59864
Oct 06 13:07:42 np0005472061.novalocal sshd-session[4340]: Connection closed by invalid user test 59.15.99.151 port 59864 [preauth]
Oct 06 13:07:46 np0005472061.novalocal sshd-session[4342]: Connection closed by authenticating user root 59.15.99.151 port 60142 [preauth]
Oct 06 13:07:49 np0005472061.novalocal sshd-session[4344]: Connection closed by authenticating user root 59.15.99.151 port 60606 [preauth]
Oct 06 13:07:54 np0005472061.novalocal sshd-session[4346]: Invalid user ramp from 59.15.99.151 port 60870
Oct 06 13:07:54 np0005472061.novalocal sshd-session[4346]: Connection closed by invalid user ramp 59.15.99.151 port 60870 [preauth]
Oct 06 13:07:57 np0005472061.novalocal sshd-session[4351]: Accepted publickey for zuul from 38.102.83.114 port 54036 ssh2: RSA SHA256:Jx12jaLmKdzqWUaxClrd355NjuSq2gOyPD0e5qs8aYc
Oct 06 13:07:57 np0005472061.novalocal systemd-logind[789]: New session 4 of user zuul.
Oct 06 13:07:57 np0005472061.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 06 13:07:57 np0005472061.novalocal sshd-session[4351]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:07:57 np0005472061.novalocal sudo[4378]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eurtokldqvrkltohgywjieyqredqllma ; /usr/bin/python3'
Oct 06 13:07:57 np0005472061.novalocal sudo[4378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:07:57 np0005472061.novalocal sshd-session[4348]: Connection closed by authenticating user root 59.15.99.151 port 33174 [preauth]
Oct 06 13:07:57 np0005472061.novalocal python3[4380]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-bfc8-ee00-000000001cf1-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:07:57 np0005472061.novalocal sudo[4378]: pam_unix(sudo:session): session closed for user root
Oct 06 13:07:58 np0005472061.novalocal sudo[4407]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwdfjzweepufhwozsljlbzozgdzpshqy ; /usr/bin/python3'
Oct 06 13:07:58 np0005472061.novalocal sudo[4407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:07:58 np0005472061.novalocal python3[4409]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:07:58 np0005472061.novalocal sudo[4407]: pam_unix(sudo:session): session closed for user root
Oct 06 13:07:58 np0005472061.novalocal sudo[4433]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bplzzesvnpbuojgvnhyinyzyfaruhoni ; /usr/bin/python3'
Oct 06 13:07:58 np0005472061.novalocal sudo[4433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:07:58 np0005472061.novalocal python3[4435]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:07:58 np0005472061.novalocal sudo[4433]: pam_unix(sudo:session): session closed for user root
Oct 06 13:07:59 np0005472061.novalocal sudo[4459]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjgodhjoffahwpjphotqeuituxavodmr ; /usr/bin/python3'
Oct 06 13:07:59 np0005472061.novalocal sudo[4459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:07:59 np0005472061.novalocal python3[4461]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:07:59 np0005472061.novalocal sudo[4459]: pam_unix(sudo:session): session closed for user root
Oct 06 13:07:59 np0005472061.novalocal sudo[4485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcdnnfuymsngjdzvxfxwaqkboginsccn ; /usr/bin/python3'
Oct 06 13:07:59 np0005472061.novalocal sudo[4485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:07:59 np0005472061.novalocal python3[4487]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:07:59 np0005472061.novalocal sudo[4485]: pam_unix(sudo:session): session closed for user root
Oct 06 13:07:59 np0005472061.novalocal sudo[4511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbwxkhhvgnecorogoacectqxuhanywaj ; /usr/bin/python3'
Oct 06 13:07:59 np0005472061.novalocal sudo[4511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:00 np0005472061.novalocal python3[4513]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:08:00 np0005472061.novalocal python3[4513]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 06 13:08:00 np0005472061.novalocal sudo[4511]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:00 np0005472061.novalocal sudo[4537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhqhukclmyslkifyqkqsphagxblpiotl ; /usr/bin/python3'
Oct 06 13:08:00 np0005472061.novalocal sudo[4537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:00 np0005472061.novalocal python3[4539]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:08:00 np0005472061.novalocal systemd[1]: Reloading.
Oct 06 13:08:00 np0005472061.novalocal systemd-rc-local-generator[4562]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:08:01 np0005472061.novalocal sudo[4537]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:02 np0005472061.novalocal sudo[4593]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnpghhfsxhpuycqqyjrrkkmhluzbybjv ; /usr/bin/python3'
Oct 06 13:08:02 np0005472061.novalocal sudo[4593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:02 np0005472061.novalocal python3[4595]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 06 13:08:02 np0005472061.novalocal sudo[4593]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:02 np0005472061.novalocal sudo[4619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbbbpzdddzdqlsirsjvwgyamjyamscb ; /usr/bin/python3'
Oct 06 13:08:02 np0005472061.novalocal sudo[4619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:02 np0005472061.novalocal python3[4621]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:08:02 np0005472061.novalocal sudo[4619]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:03 np0005472061.novalocal sudo[4647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtpqfqmwpospsqwdnxuwjdkwgynonfrd ; /usr/bin/python3'
Oct 06 13:08:03 np0005472061.novalocal sudo[4647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:03 np0005472061.novalocal python3[4649]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:08:03 np0005472061.novalocal sudo[4647]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:03 np0005472061.novalocal sudo[4675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovrbcdjagnmshqwqrbnzgcwhjbhhyji ; /usr/bin/python3'
Oct 06 13:08:03 np0005472061.novalocal sudo[4675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:03 np0005472061.novalocal python3[4677]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:08:03 np0005472061.novalocal sudo[4675]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:03 np0005472061.novalocal sudo[4703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqtregacybullrgvpbbxfzvnlkhidaav ; /usr/bin/python3'
Oct 06 13:08:03 np0005472061.novalocal sudo[4703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:03 np0005472061.novalocal python3[4705]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:08:03 np0005472061.novalocal sudo[4703]: pam_unix(sudo:session): session closed for user root
Oct 06 13:08:04 np0005472061.novalocal python3[4732]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e3b-3c83-bfc8-ee00-000000001cf7-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:08:04 np0005472061.novalocal python3[4762]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:08:05 np0005472061.novalocal irqbalance[782]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 06 13:08:05 np0005472061.novalocal irqbalance[782]: IRQ 27 affinity is now unmanaged
Oct 06 13:08:07 np0005472061.novalocal sshd-session[4354]: Connection closed by 38.102.83.114 port 54036
Oct 06 13:08:07 np0005472061.novalocal sshd-session[4351]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:08:07 np0005472061.novalocal systemd-logind[789]: Session 4 logged out. Waiting for processes to exit.
Oct 06 13:08:07 np0005472061.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 06 13:08:07 np0005472061.novalocal systemd[1]: session-4.scope: Consumed 3.825s CPU time.
Oct 06 13:08:07 np0005472061.novalocal systemd-logind[789]: Removed session 4.
Oct 06 13:08:09 np0005472061.novalocal sshd-session[4768]: Accepted publickey for zuul from 38.102.83.114 port 36912 ssh2: RSA SHA256:Jx12jaLmKdzqWUaxClrd355NjuSq2gOyPD0e5qs8aYc
Oct 06 13:08:09 np0005472061.novalocal systemd-logind[789]: New session 5 of user zuul.
Oct 06 13:08:09 np0005472061.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 06 13:08:09 np0005472061.novalocal sshd-session[4768]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:08:09 np0005472061.novalocal sudo[4795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxvgjrbrdlkqsjrfzndrcsaksivtmlev ; /usr/bin/python3'
Oct 06 13:08:09 np0005472061.novalocal sudo[4795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:08:09 np0005472061.novalocal python3[4797]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:08:25 np0005472061.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:08:34 np0005472061.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:08:43 np0005472061.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:08:44 np0005472061.novalocal setsebool[4864]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 06 13:08:44 np0005472061.novalocal setsebool[4864]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  Converting 367 SID table entries...
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:08:54 np0005472061.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:09:14 np0005472061.novalocal dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 06 13:09:14 np0005472061.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:09:14 np0005472061.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:09:14 np0005472061.novalocal systemd[1]: Reloading.
Oct 06 13:09:14 np0005472061.novalocal systemd-rc-local-generator[5615]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:09:14 np0005472061.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:09:15 np0005472061.novalocal systemd[1]: Starting PackageKit Daemon...
Oct 06 13:09:15 np0005472061.novalocal PackageKit[6268]: daemon start
Oct 06 13:09:15 np0005472061.novalocal systemd[1]: Starting Authorization Manager...
Oct 06 13:09:15 np0005472061.novalocal polkitd[6342]: Started polkitd version 0.117
Oct 06 13:09:15 np0005472061.novalocal polkitd[6342]: Loading rules from directory /etc/polkit-1/rules.d
Oct 06 13:09:15 np0005472061.novalocal polkitd[6342]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 06 13:09:15 np0005472061.novalocal polkitd[6342]: Finished loading, compiling and executing 3 rules
Oct 06 13:09:15 np0005472061.novalocal polkitd[6342]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 06 13:09:15 np0005472061.novalocal systemd[1]: Started Authorization Manager.
Oct 06 13:09:15 np0005472061.novalocal systemd[1]: Started PackageKit Daemon.
Oct 06 13:09:15 np0005472061.novalocal sudo[4795]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:22 np0005472061.novalocal python3[9993]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-b104-8a53-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:09:23 np0005472061.novalocal kernel: evm: overlay not supported
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: Starting D-Bus User Message Bus...
Oct 06 13:09:23 np0005472061.novalocal dbus-broker-launch[10617]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 06 13:09:23 np0005472061.novalocal dbus-broker-launch[10617]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: Started D-Bus User Message Bus.
Oct 06 13:09:23 np0005472061.novalocal dbus-broker-lau[10617]: Ready
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: Created slice Slice /user.
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: podman-10569.scope: unit configures an IP firewall, but not running as root.
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: (This warning is only shown for the first unit using IP firewalling.)
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: Started podman-10569.scope.
Oct 06 13:09:23 np0005472061.novalocal systemd[1058]: Started podman-pause-d66c5c4b.scope.
Oct 06 13:09:23 np0005472061.novalocal sudo[10732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgcdceohoqkxnnlobuasbgbmfssqdbwb ; /usr/bin/python3'
Oct 06 13:09:23 np0005472061.novalocal sudo[10732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:24 np0005472061.novalocal python3[10734]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.151:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.151:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:09:24 np0005472061.novalocal sudo[10732]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:24 np0005472061.novalocal sshd-session[4771]: Connection closed by 38.102.83.114 port 36912
Oct 06 13:09:24 np0005472061.novalocal sshd-session[4768]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:09:24 np0005472061.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 06 13:09:24 np0005472061.novalocal systemd[1]: session-5.scope: Consumed 59.698s CPU time.
Oct 06 13:09:24 np0005472061.novalocal systemd-logind[789]: Session 5 logged out. Waiting for processes to exit.
Oct 06 13:09:24 np0005472061.novalocal systemd-logind[789]: Removed session 5.
Oct 06 13:09:34 np0005472061.novalocal sshd-session[14206]: banner exchange: Connection from 65.49.1.162 port 51520: invalid format
Oct 06 13:09:44 np0005472061.novalocal sshd-session[17339]: Connection closed by 38.102.83.162 port 41044 [preauth]
Oct 06 13:09:44 np0005472061.novalocal sshd-session[17341]: Connection closed by 38.102.83.162 port 41054 [preauth]
Oct 06 13:09:44 np0005472061.novalocal sshd-session[17347]: Unable to negotiate with 38.102.83.162 port 41074: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 06 13:09:44 np0005472061.novalocal sshd-session[17345]: Unable to negotiate with 38.102.83.162 port 41064: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 06 13:09:44 np0005472061.novalocal sshd-session[17349]: Unable to negotiate with 38.102.83.162 port 41090: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 06 13:09:49 np0005472061.novalocal sshd-session[18949]: Accepted publickey for zuul from 38.102.83.114 port 39162 ssh2: RSA SHA256:Jx12jaLmKdzqWUaxClrd355NjuSq2gOyPD0e5qs8aYc
Oct 06 13:09:49 np0005472061.novalocal systemd-logind[789]: New session 6 of user zuul.
Oct 06 13:09:49 np0005472061.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 06 13:09:49 np0005472061.novalocal sshd-session[18949]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:09:49 np0005472061.novalocal python3[19055]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJl9NTU+NHJ0vhznbC2Ck8LnuXbuPe5y82hHiKf5WRwzdyC0XERAGKi6JfDH0hrXsKHF7N9QPt268Zu0kky1qhs= zuul@np0005472060.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:09:49 np0005472061.novalocal sudo[19208]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxdvrhlzdgzcnenjmduyuohlamdatojg ; /usr/bin/python3'
Oct 06 13:09:49 np0005472061.novalocal sudo[19208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:50 np0005472061.novalocal python3[19217]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJl9NTU+NHJ0vhznbC2Ck8LnuXbuPe5y82hHiKf5WRwzdyC0XERAGKi6JfDH0hrXsKHF7N9QPt268Zu0kky1qhs= zuul@np0005472060.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:09:50 np0005472061.novalocal sudo[19208]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:50 np0005472061.novalocal sudo[19494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhozfpcyuonpeocyiqkrcioetvcgiekm ; /usr/bin/python3'
Oct 06 13:09:50 np0005472061.novalocal sudo[19494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:50 np0005472061.novalocal python3[19504]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005472061.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 06 13:09:50 np0005472061.novalocal useradd[19584]: new group: name=cloud-admin, GID=1002
Oct 06 13:09:50 np0005472061.novalocal useradd[19584]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Oct 06 13:09:51 np0005472061.novalocal sudo[19494]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:51 np0005472061.novalocal sudo[19726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spqwucdushsjkiriewrbjaytjailcgac ; /usr/bin/python3'
Oct 06 13:09:51 np0005472061.novalocal sudo[19726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:51 np0005472061.novalocal python3[19733]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJl9NTU+NHJ0vhznbC2Ck8LnuXbuPe5y82hHiKf5WRwzdyC0XERAGKi6JfDH0hrXsKHF7N9QPt268Zu0kky1qhs= zuul@np0005472060.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 06 13:09:51 np0005472061.novalocal sudo[19726]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:51 np0005472061.novalocal sudo[19926]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzcwoiduxhknyexfpwmgekkvntmlvjxd ; /usr/bin/python3'
Oct 06 13:09:51 np0005472061.novalocal sudo[19926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:52 np0005472061.novalocal python3[19933]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:09:52 np0005472061.novalocal sudo[19926]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:52 np0005472061.novalocal sudo[20101]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afwgaedtfwaxwrtlmridmwcdtnaiftxg ; /usr/bin/python3'
Oct 06 13:09:52 np0005472061.novalocal sudo[20101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:52 np0005472061.novalocal python3[20106]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759756191.7076483-151-67730222540944/source _original_basename=tmpec433l9p follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:09:52 np0005472061.novalocal sudo[20101]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:53 np0005472061.novalocal sudo[20315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfuznooghylupokmiuxoikteeyqhvoy ; /usr/bin/python3'
Oct 06 13:09:53 np0005472061.novalocal sudo[20315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:09:53 np0005472061.novalocal python3[20320]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Oct 06 13:09:53 np0005472061.novalocal systemd[1]: Starting Hostname Service...
Oct 06 13:09:53 np0005472061.novalocal systemd[1]: Started Hostname Service.
Oct 06 13:09:53 np0005472061.novalocal systemd-hostnamed[20379]: Changed pretty hostname to 'compute-0'
Oct 06 13:09:53 compute-0 systemd-hostnamed[20379]: Hostname set to <compute-0> (static)
Oct 06 13:09:53 compute-0 NetworkManager[3953]: <info>  [1759756193.5053] hostname: static hostname changed from "np0005472061.novalocal" to "compute-0"
Oct 06 13:09:53 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:09:53 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:09:53 compute-0 sudo[20315]: pam_unix(sudo:session): session closed for user root
Oct 06 13:09:53 compute-0 sshd-session[18992]: Connection closed by 38.102.83.114 port 39162
Oct 06 13:09:53 compute-0 sshd-session[18949]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:09:53 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct 06 13:09:53 compute-0 systemd[1]: session-6.scope: Consumed 2.588s CPU time.
Oct 06 13:09:53 compute-0 systemd-logind[789]: Session 6 logged out. Waiting for processes to exit.
Oct 06 13:09:53 compute-0 systemd-logind[789]: Removed session 6.
Oct 06 13:10:03 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:10:17 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:10:17 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:10:17 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 11.327s CPU time.
Oct 06 13:10:17 compute-0 systemd[1]: run-r1d80dc42c0d04ac4ba38790638c7eb24.service: Deactivated successfully.
Oct 06 13:10:23 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 06 13:12:01 compute-0 anacron[3714]: Job `cron.daily' started
Oct 06 13:12:01 compute-0 anacron[3714]: Job `cron.daily' terminated
Oct 06 13:13:35 compute-0 sshd-session[26691]: Accepted publickey for zuul from 38.102.83.162 port 39906 ssh2: RSA SHA256:Jx12jaLmKdzqWUaxClrd355NjuSq2gOyPD0e5qs8aYc
Oct 06 13:13:35 compute-0 systemd-logind[789]: New session 7 of user zuul.
Oct 06 13:13:35 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct 06 13:13:35 compute-0 sshd-session[26691]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:13:36 compute-0 python3[26767]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:13:38 compute-0 sudo[26881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idkybqjhdomcskkwcpdnakiofzgptbdn ; /usr/bin/python3'
Oct 06 13:13:38 compute-0 sudo[26881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:38 compute-0 python3[26883]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:38 compute-0 sudo[26881]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:38 compute-0 sudo[26954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idlpexivqjiafnaeutiwyetbftgmcxei ; /usr/bin/python3'
Oct 06 13:13:38 compute-0 sudo[26954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:38 compute-0 python3[26956]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=delorean.repo follow=False checksum=f3053b54268f7da202e825fb393ed57ec9eebf73 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:38 compute-0 sudo[26954]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:38 compute-0 sudo[26980]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykdryfxgvgbbcdeaonyduburolnrhutg ; /usr/bin/python3'
Oct 06 13:13:38 compute-0 sudo[26980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:38 compute-0 python3[26982]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:38 compute-0 sudo[26980]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:39 compute-0 sudo[27053]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumcfhfrkpkezncggwssnhvlqwksclml ; /usr/bin/python3'
Oct 06 13:13:39 compute-0 sudo[27053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:39 compute-0 python3[27055]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:39 compute-0 sudo[27053]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:39 compute-0 sudo[27079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnuefbpsnxonpevjrnpzvpkdyhvfexxg ; /usr/bin/python3'
Oct 06 13:13:39 compute-0 sudo[27079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:39 compute-0 python3[27081]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:39 compute-0 sudo[27079]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:39 compute-0 sudo[27152]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdpfjaegldxvbaouyjguvsmqzxbhjmgv ; /usr/bin/python3'
Oct 06 13:13:39 compute-0 sudo[27152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:39 compute-0 python3[27154]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:39 compute-0 sudo[27152]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:39 compute-0 sudo[27178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctabjqpuztfkddpruggrrzsqtzrtbryc ; /usr/bin/python3'
Oct 06 13:13:39 compute-0 sudo[27178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:40 compute-0 python3[27180]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:40 compute-0 sudo[27178]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:40 compute-0 sudo[27251]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwbqqfugoqrjsiqsgkzcsqjnpdasqvgi ; /usr/bin/python3'
Oct 06 13:13:40 compute-0 sudo[27251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:40 compute-0 python3[27253]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:40 compute-0 sudo[27251]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:40 compute-0 sudo[27277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtyzbzuwgyjgcpvmnipbxgvathosjouv ; /usr/bin/python3'
Oct 06 13:13:40 compute-0 sudo[27277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:40 compute-0 python3[27279]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:40 compute-0 sudo[27277]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:41 compute-0 sudo[27350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raawnqjwqdvxegrubpqjsrhxwfsxoajf ; /usr/bin/python3'
Oct 06 13:13:41 compute-0 sudo[27350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:41 compute-0 python3[27352]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:41 compute-0 sudo[27350]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:41 compute-0 sudo[27376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmgblggcyuteenrbrogkadkwqzbssdw ; /usr/bin/python3'
Oct 06 13:13:41 compute-0 sudo[27376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:41 compute-0 python3[27378]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:41 compute-0 sudo[27376]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:41 compute-0 sudo[27449]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdvitrkkeickbmbjejcuwwddxpmbbnxn ; /usr/bin/python3'
Oct 06 13:13:41 compute-0 sudo[27449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:42 compute-0 python3[27451]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:42 compute-0 sudo[27449]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:42 compute-0 sudo[27475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojmhfnbptahsgtfenhktimjdbtnxjpmm ; /usr/bin/python3'
Oct 06 13:13:42 compute-0 sudo[27475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:42 compute-0 python3[27477]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:42 compute-0 sudo[27475]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:42 compute-0 sudo[27548]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkxtsnootwhrrswgzuoqrdomxspkzhms ; /usr/bin/python3'
Oct 06 13:13:42 compute-0 sudo[27548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:42 compute-0 python3[27550]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=76a47a7cad5dcf9aac1c7fdfc6c635805db6c4f1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:42 compute-0 sudo[27548]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:42 compute-0 sudo[27574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwcrpfdgawbfqhcdorodrgzqavmatsn ; /usr/bin/python3'
Oct 06 13:13:42 compute-0 sudo[27574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:43 compute-0 python3[27576]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 06 13:13:43 compute-0 sudo[27574]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:43 compute-0 sudo[27647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crpwcwxmaeuqhhyhvmhvqrjanpawgkfu ; /usr/bin/python3'
Oct 06 13:13:43 compute-0 sudo[27647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:13:43 compute-0 python3[27649]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759756417.8410952-30451-1513419586891/source mode=0755 _original_basename=gating.repo follow=False checksum=d3e3b0226288550a485320956c5dc5305450cfe5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:13:43 compute-0 sudo[27647]: pam_unix(sudo:session): session closed for user root
Oct 06 13:13:45 compute-0 sshd-session[27674]: Connection closed by 192.168.122.11 port 58284 [preauth]
Oct 06 13:13:45 compute-0 sshd-session[27675]: Connection closed by 192.168.122.11 port 58288 [preauth]
Oct 06 13:13:45 compute-0 sshd-session[27676]: Unable to negotiate with 192.168.122.11 port 58302: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 06 13:13:45 compute-0 sshd-session[27677]: Unable to negotiate with 192.168.122.11 port 58306: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 06 13:13:45 compute-0 sshd-session[27678]: Unable to negotiate with 192.168.122.11 port 58320: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 06 13:14:21 compute-0 PackageKit[6268]: daemon quit
Oct 06 13:14:21 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 06 13:14:21 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 06 13:14:21 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 06 13:14:21 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 06 13:14:21 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 06 13:14:49 compute-0 python3[27710]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:16:02 compute-0 sshd-session[27713]: Connection closed by authenticating user root 193.32.162.157 port 42992 [preauth]
Oct 06 13:16:19 compute-0 sshd-session[27715]: Connection closed by authenticating user root 193.32.162.157 port 43552 [preauth]
Oct 06 13:16:37 compute-0 sshd-session[27717]: Connection closed by authenticating user root 193.32.162.157 port 46676 [preauth]
Oct 06 13:16:53 compute-0 sshd-session[27719]: Connection closed by authenticating user root 193.32.162.157 port 49658 [preauth]
Oct 06 13:17:10 compute-0 sshd-session[27721]: Connection closed by authenticating user root 193.32.162.157 port 45838 [preauth]
Oct 06 13:17:27 compute-0 sshd-session[27723]: Connection closed by authenticating user root 193.32.162.157 port 54908 [preauth]
Oct 06 13:17:43 compute-0 sshd-session[27725]: Connection closed by authenticating user root 193.32.162.157 port 48744 [preauth]
Oct 06 13:18:00 compute-0 sshd-session[27727]: Connection closed by authenticating user root 193.32.162.157 port 44806 [preauth]
Oct 06 13:18:17 compute-0 sshd-session[27729]: Connection closed by authenticating user root 193.32.162.157 port 36560 [preauth]
Oct 06 13:18:33 compute-0 sshd-session[27731]: Connection closed by authenticating user root 193.32.162.157 port 41690 [preauth]
Oct 06 13:18:50 compute-0 sshd-session[27733]: Connection closed by authenticating user root 193.32.162.157 port 48132 [preauth]
Oct 06 13:19:06 compute-0 sshd-session[27736]: Connection closed by authenticating user root 193.32.162.157 port 42144 [preauth]
Oct 06 13:19:23 compute-0 sshd-session[27738]: Connection closed by authenticating user root 193.32.162.157 port 37936 [preauth]
Oct 06 13:19:39 compute-0 sshd-session[27740]: Connection closed by authenticating user root 193.32.162.157 port 54980 [preauth]
Oct 06 13:19:48 compute-0 sshd-session[26694]: Received disconnect from 38.102.83.162 port 39906:11: disconnected by user
Oct 06 13:19:48 compute-0 sshd-session[26694]: Disconnected from user zuul 38.102.83.162 port 39906
Oct 06 13:19:48 compute-0 sshd-session[26691]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:19:48 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct 06 13:19:48 compute-0 systemd[1]: session-7.scope: Consumed 6.582s CPU time.
Oct 06 13:19:48 compute-0 systemd-logind[789]: Session 7 logged out. Waiting for processes to exit.
Oct 06 13:19:48 compute-0 systemd-logind[789]: Removed session 7.
Oct 06 13:19:56 compute-0 sshd-session[27742]: Connection closed by authenticating user root 193.32.162.157 port 49926 [preauth]
Oct 06 13:20:13 compute-0 sshd-session[27744]: Connection closed by authenticating user root 193.32.162.157 port 43266 [preauth]
Oct 06 13:20:29 compute-0 sshd-session[27746]: Connection closed by authenticating user root 193.32.162.157 port 45536 [preauth]
Oct 06 13:20:46 compute-0 sshd-session[27748]: Connection closed by authenticating user root 193.32.162.157 port 42230 [preauth]
Oct 06 13:21:03 compute-0 sshd-session[27750]: Connection closed by authenticating user root 193.32.162.157 port 57474 [preauth]
Oct 06 13:21:20 compute-0 sshd-session[27753]: Connection closed by authenticating user root 193.32.162.157 port 58472 [preauth]
Oct 06 13:21:36 compute-0 sshd-session[27755]: Connection closed by authenticating user root 193.32.162.157 port 37556 [preauth]
Oct 06 13:21:53 compute-0 sshd-session[27757]: Connection closed by authenticating user root 193.32.162.157 port 36650 [preauth]
Oct 06 13:22:10 compute-0 sshd-session[27759]: Connection closed by authenticating user root 193.32.162.157 port 52932 [preauth]
Oct 06 13:22:27 compute-0 sshd-session[27761]: Connection closed by authenticating user root 193.32.162.157 port 45060 [preauth]
Oct 06 13:22:44 compute-0 sshd-session[27763]: Connection closed by authenticating user root 193.32.162.157 port 50878 [preauth]
Oct 06 13:23:01 compute-0 sshd-session[27765]: Connection closed by authenticating user root 193.32.162.157 port 34342 [preauth]
Oct 06 13:23:17 compute-0 sshd-session[27767]: Connection closed by authenticating user root 193.32.162.157 port 56682 [preauth]
Oct 06 13:23:34 compute-0 sshd-session[27769]: Connection closed by authenticating user root 193.32.162.157 port 47326 [preauth]
Oct 06 13:23:41 compute-0 sshd-session[27773]: Connection closed by authenticating user root 203.33.206.106 port 57852 [preauth]
Oct 06 13:23:51 compute-0 sshd-session[27771]: Connection closed by authenticating user root 193.32.162.157 port 38068 [preauth]
Oct 06 13:23:51 compute-0 sshd-session[27775]: Connection closed by authenticating user root 203.33.206.106 port 34624 [preauth]
Oct 06 13:23:53 compute-0 sshd-session[27778]: Connection closed by authenticating user root 203.33.206.106 port 50254 [preauth]
Oct 06 13:23:57 compute-0 sshd-session[27781]: Connection closed by authenticating user root 203.33.206.106 port 53054 [preauth]
Oct 06 13:24:00 compute-0 sshd-session[27783]: Connection closed by authenticating user root 203.33.206.106 port 60356 [preauth]
Oct 06 13:24:07 compute-0 sshd-session[27785]: Connection closed by authenticating user root 203.33.206.106 port 36752 [preauth]
Oct 06 13:24:07 compute-0 sshd-session[27777]: Connection closed by authenticating user root 193.32.162.157 port 56938 [preauth]
Oct 06 13:24:16 compute-0 sshd-session[27789]: Connection closed by authenticating user root 203.33.206.106 port 45652 [preauth]
Oct 06 13:24:21 compute-0 sshd-session[27791]: Connection closed by authenticating user root 203.33.206.106 port 32972 [preauth]
Oct 06 13:24:23 compute-0 sshd-session[27793]: Connection closed by authenticating user root 203.33.206.106 port 41216 [preauth]
Oct 06 13:24:24 compute-0 sshd-session[27787]: Connection closed by authenticating user root 193.32.162.157 port 41056 [preauth]
Oct 06 13:24:40 compute-0 sshd-session[27796]: Connection closed by authenticating user root 203.33.206.106 port 45282 [preauth]
Oct 06 13:24:41 compute-0 sshd-session[27795]: Connection closed by authenticating user root 193.32.162.157 port 45874 [preauth]
Oct 06 13:24:46 compute-0 sshd-session[27800]: Connection closed by authenticating user root 203.33.206.106 port 44370 [preauth]
Oct 06 13:24:54 compute-0 sshd-session[27803]: Connection closed by authenticating user root 203.33.206.106 port 53678 [preauth]
Oct 06 13:24:58 compute-0 sshd-session[27799]: Connection closed by authenticating user root 193.32.162.157 port 50548 [preauth]
Oct 06 13:25:00 compute-0 sshd-session[27805]: Connection closed by authenticating user root 203.33.206.106 port 36522 [preauth]
Oct 06 13:25:06 compute-0 sshd-session[27809]: Connection closed by authenticating user root 203.33.206.106 port 47510 [preauth]
Oct 06 13:25:15 compute-0 sshd-session[27807]: Connection closed by authenticating user root 193.32.162.157 port 42522 [preauth]
Oct 06 13:25:16 compute-0 sshd-session[27811]: Connection closed by authenticating user root 203.33.206.106 port 56508 [preauth]
Oct 06 13:25:31 compute-0 sshd-session[27813]: Connection closed by authenticating user root 193.32.162.157 port 35432 [preauth]
Oct 06 13:25:35 compute-0 sshd-session[27815]: Connection closed by authenticating user root 203.33.206.106 port 45644 [preauth]
Oct 06 13:25:38 compute-0 sshd-session[27819]: Connection closed by authenticating user root 203.33.206.106 port 48698 [preauth]
Oct 06 13:25:48 compute-0 sshd-session[27817]: Connection closed by authenticating user root 193.32.162.157 port 45348 [preauth]
Oct 06 13:25:57 compute-0 sshd-session[27821]: Connection closed by authenticating user root 203.33.206.106 port 54596 [preauth]
Oct 06 13:26:04 compute-0 sshd-session[27823]: Connection closed by authenticating user root 193.32.162.157 port 60496 [preauth]
Oct 06 13:26:10 compute-0 sshd-session[27828]: Connection closed by authenticating user root 203.33.206.106 port 47868 [preauth]
Oct 06 13:26:16 compute-0 sshd-session[27825]: Connection closed by authenticating user root 203.33.206.106 port 58796 [preauth]
Oct 06 13:26:17 compute-0 sshd-session[27831]: Connection closed by authenticating user root 203.33.206.106 port 53050 [preauth]
Oct 06 13:26:21 compute-0 sshd-session[27827]: Connection closed by authenticating user root 193.32.162.157 port 47478 [preauth]
Oct 06 13:26:23 compute-0 sshd-session[27833]: Connection closed by authenticating user root 203.33.206.106 port 35540 [preauth]
Oct 06 13:26:30 compute-0 sshd-session[27837]: Connection closed by authenticating user root 203.33.206.106 port 46136 [preauth]
Oct 06 13:26:38 compute-0 sshd-session[27835]: Connection closed by authenticating user root 193.32.162.157 port 38710 [preauth]
Oct 06 13:26:44 compute-0 sshd-session[27840]: Connection closed by authenticating user root 203.33.206.106 port 56656 [preauth]
Oct 06 13:26:54 compute-0 sshd-session[27848]: Accepted publickey for zuul from 192.168.122.30 port 57010 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:26:54 compute-0 systemd-logind[789]: New session 8 of user zuul.
Oct 06 13:26:54 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct 06 13:26:54 compute-0 sshd-session[27848]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:26:55 compute-0 sshd-session[27842]: Connection closed by authenticating user root 193.32.162.157 port 58948 [preauth]
Oct 06 13:26:55 compute-0 sshd-session[27844]: Connection closed by authenticating user root 203.33.206.106 port 52974 [preauth]
Oct 06 13:26:55 compute-0 python3.9[28002]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:26:56 compute-0 sshd-session[27846]: Connection closed by authenticating user root 203.33.206.106 port 39954 [preauth]
Oct 06 13:26:57 compute-0 sudo[28181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxlshlcdgjhxirenthletvgrwuzvwcpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757216.6228936-44-80978731856029/AnsiballZ_command.py'
Oct 06 13:26:57 compute-0 sudo[28181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:26:57 compute-0 python3.9[28183]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:27:04 compute-0 sudo[28181]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:04 compute-0 sshd-session[27851]: Connection closed by 192.168.122.30 port 57010
Oct 06 13:27:04 compute-0 sshd-session[27848]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:27:04 compute-0 systemd-logind[789]: Session 8 logged out. Waiting for processes to exit.
Oct 06 13:27:04 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct 06 13:27:04 compute-0 systemd[1]: session-8.scope: Consumed 8.201s CPU time.
Oct 06 13:27:04 compute-0 systemd-logind[789]: Removed session 8.
Oct 06 13:27:07 compute-0 sshd-session[28191]: Connection closed by authenticating user root 203.33.206.106 port 43222 [preauth]
Oct 06 13:27:09 compute-0 sshd-session[28246]: Accepted publickey for zuul from 192.168.122.30 port 44530 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:27:09 compute-0 systemd-logind[789]: New session 9 of user zuul.
Oct 06 13:27:09 compute-0 systemd[1]: Started Session 9 of User zuul.
Oct 06 13:27:09 compute-0 sshd-session[28246]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:27:10 compute-0 sshd-session[28244]: Connection closed by authenticating user root 203.33.206.106 port 33442 [preauth]
Oct 06 13:27:10 compute-0 python3.9[28399]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:27:11 compute-0 sshd-session[28249]: Connection closed by 192.168.122.30 port 44530
Oct 06 13:27:11 compute-0 sshd-session[28246]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:27:11 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Oct 06 13:27:11 compute-0 systemd-logind[789]: Session 9 logged out. Waiting for processes to exit.
Oct 06 13:27:11 compute-0 systemd-logind[789]: Removed session 9.
Oct 06 13:27:12 compute-0 sshd-session[27904]: Connection closed by authenticating user root 193.32.162.157 port 57342 [preauth]
Oct 06 13:27:16 compute-0 sshd-session[28427]: Connection closed by authenticating user root 203.33.206.106 port 38524 [preauth]
Oct 06 13:27:21 compute-0 sshd-session[28431]: Connection closed by authenticating user root 203.33.206.106 port 50048 [preauth]
Oct 06 13:27:26 compute-0 sshd-session[28435]: Accepted publickey for zuul from 192.168.122.30 port 53688 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:27:26 compute-0 systemd-logind[789]: New session 10 of user zuul.
Oct 06 13:27:26 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct 06 13:27:26 compute-0 sshd-session[28435]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:27:27 compute-0 sshd-session[28433]: Connection closed by authenticating user root 203.33.206.106 port 59496 [preauth]
Oct 06 13:27:27 compute-0 python3.9[28589]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 06 13:27:28 compute-0 sshd-session[28429]: Connection closed by authenticating user root 193.32.162.157 port 50794 [preauth]
Oct 06 13:27:29 compute-0 python3.9[28764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:27:30 compute-0 sudo[28915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxdxsiwwqykxmmmsnhwzwzjussybstnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757249.562428-69-642802502476/AnsiballZ_command.py'
Oct 06 13:27:30 compute-0 sudo[28915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:30 compute-0 python3.9[28917]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:27:30 compute-0 sudo[28915]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:30 compute-0 sshd-session[28491]: Connection closed by authenticating user root 203.33.206.106 port 40652 [preauth]
Oct 06 13:27:31 compute-0 sudo[29068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlqvrdmppvqirmoknpwnbhknydezfdfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757250.7473228-93-81327290673775/AnsiballZ_stat.py'
Oct 06 13:27:31 compute-0 sudo[29068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:31 compute-0 python3.9[29070]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:27:31 compute-0 sudo[29068]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:32 compute-0 sudo[29220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzvoaygdkhzvuwcaswmtllhbcisftwzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757251.633805-109-95807786061084/AnsiballZ_file.py'
Oct 06 13:27:32 compute-0 sudo[29220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:32 compute-0 python3.9[29222]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:27:32 compute-0 sudo[29220]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:33 compute-0 sudo[29373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvwidisxzjhxszbjzeeoczbtlfzapeyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757252.6507993-125-76527395891555/AnsiballZ_stat.py'
Oct 06 13:27:33 compute-0 sudo[29373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:33 compute-0 python3.9[29375]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:27:33 compute-0 sudo[29373]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:33 compute-0 sudo[29496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylwgilvhdviivhlzemptujundzumvhau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757252.6507993-125-76527395891555/AnsiballZ_copy.py'
Oct 06 13:27:33 compute-0 sudo[29496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:34 compute-0 python3.9[29498]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757252.6507993-125-76527395891555/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:27:34 compute-0 sudo[29496]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:34 compute-0 sudo[29648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gipjbskxochvfpoaiwrynwclgmewqlib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757254.3533664-155-251032610553728/AnsiballZ_setup.py'
Oct 06 13:27:34 compute-0 sudo[29648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:35 compute-0 python3.9[29650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:27:35 compute-0 sudo[29648]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:35 compute-0 sudo[29804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivionnwnnauiqhrewhwjkveuskjhmvwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757255.4750755-171-181587476341306/AnsiballZ_file.py'
Oct 06 13:27:35 compute-0 sudo[29804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:36 compute-0 python3.9[29806]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:27:36 compute-0 sudo[29804]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:36 compute-0 python3.9[29956]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:27:42 compute-0 python3.9[30211]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:27:42 compute-0 python3.9[30361]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:27:44 compute-0 python3.9[30515]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:27:45 compute-0 sudo[30671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwnzzjyittihnmiandjdyaukpfyxnru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757264.8427079-267-9104918684242/AnsiballZ_setup.py'
Oct 06 13:27:45 compute-0 sudo[30671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:45 compute-0 python3.9[30673]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:27:45 compute-0 sudo[30671]: pam_unix(sudo:session): session closed for user root
Oct 06 13:27:45 compute-0 sshd-session[28765]: Connection closed by authenticating user root 193.32.162.157 port 36970 [preauth]
Oct 06 13:27:46 compute-0 sudo[30756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytqyidzwqrgwxawqlvootebettsliyuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757264.8427079-267-9104918684242/AnsiballZ_dnf.py'
Oct 06 13:27:46 compute-0 sudo[30756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:27:46 compute-0 python3.9[30758]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:28:02 compute-0 sshd-session[30750]: Connection closed by authenticating user root 193.32.162.157 port 44244 [preauth]
Oct 06 13:28:19 compute-0 sshd-session[30867]: Connection closed by authenticating user root 193.32.162.157 port 42874 [preauth]
Oct 06 13:28:30 compute-0 systemd[1]: Reloading.
Oct 06 13:28:30 compute-0 systemd-rc-local-generator[30959]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:28:30 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 06 13:28:30 compute-0 systemd[1]: Reloading.
Oct 06 13:28:30 compute-0 systemd-rc-local-generator[30995]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:28:31 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 06 13:28:31 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 06 13:28:31 compute-0 systemd[1]: Reloading.
Oct 06 13:28:31 compute-0 systemd-rc-local-generator[31036]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:28:31 compute-0 systemd[1]: Starting dnf makecache...
Oct 06 13:28:31 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 06 13:28:31 compute-0 dnf[31045]: Repository 'gating-repo' is missing name in configuration, using id.
Oct 06 13:28:31 compute-0 dnf[31045]: Failed determining last makecache time.
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-barbican-42b4c41831408a8e323 157 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 194 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-cinder-1c00d6490d88e436f26ef 186 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-stevedore-c4acc5639fd2329372142 160 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:28:31 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:28:31 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-cloudkitty-tests-tempest-3961dc 182 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-diskimage-builder-43381184423c185801b5 199 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 201 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-designate-tests-tempest-347fdbc 196 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-glance-1fd12c29b339f30fe823e 208 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 204 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-manila-3c01b7181572c95dac462 195 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-vmware-nsxlib-458234972d1428ac9 199 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-octavia-ba397f07a7331190208c 203 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-watcher-c014f81a8647287f6dcc 218 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-edpm-image-builder-55ba53cf215b14ed95b 207 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 193 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-swift-dc98a8463506ac520c469a 212 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-python-tempestconf-8515371b7cceebd4282 213 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: delorean-openstack-heat-ui-013accbfd179753bc3f0 218 kB/s | 3.0 kB     00:00
Oct 06 13:28:31 compute-0 dnf[31045]: gating-repo                                     303 kB/s | 1.5 kB     00:00
Oct 06 13:28:32 compute-0 dnf[31045]: CentOS Stream 9 - BaseOS                         26 kB/s | 6.7 kB     00:00
Oct 06 13:28:32 compute-0 dnf[31045]: CentOS Stream 9 - AppStream                      54 kB/s | 6.8 kB     00:00
Oct 06 13:28:32 compute-0 dnf[31045]: CentOS Stream 9 - CRB                            54 kB/s | 6.6 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: CentOS Stream 9 - Extras packages                23 kB/s | 8.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: dlrn-antelope-testing                           114 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: dlrn-antelope-build-deps                        113 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: centos9-rabbitmq                                 88 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: centos9-storage                                  83 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: centos9-opstools                                 89 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: NFV SIG OpenvSwitch                             109 kB/s | 3.0 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: repo-setup-centos-appstream                     176 kB/s | 4.4 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: repo-setup-centos-baseos                        170 kB/s | 3.9 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: repo-setup-centos-highavailability              163 kB/s | 3.9 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: repo-setup-centos-powertools                    198 kB/s | 4.3 kB     00:00
Oct 06 13:28:33 compute-0 dnf[31045]: Extra Packages for Enterprise Linux 9 - x86_64  274 kB/s |  32 kB     00:00
Oct 06 13:28:34 compute-0 dnf[31045]: Metadata cache created.
Oct 06 13:28:34 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 06 13:28:34 compute-0 systemd[1]: Finished dnf makecache.
Oct 06 13:28:34 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.808s CPU time.
Oct 06 13:28:36 compute-0 sshd-session[30903]: Connection closed by authenticating user root 193.32.162.157 port 60204 [preauth]
Oct 06 13:28:53 compute-0 sshd-session[31114]: Connection closed by authenticating user root 193.32.162.157 port 36648 [preauth]
Oct 06 13:29:10 compute-0 sshd-session[31172]: Connection closed by authenticating user root 193.32.162.157 port 42954 [preauth]
Oct 06 13:29:27 compute-0 sshd-session[31262]: Connection closed by authenticating user root 193.32.162.157 port 35028 [preauth]
Oct 06 13:29:33 compute-0 kernel: SELinux:  Converting 2715 SID table entries...
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:29:33 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:29:33 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 06 13:29:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:29:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:29:33 compute-0 systemd[1]: Reloading.
Oct 06 13:29:34 compute-0 systemd-rc-local-generator[31412]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:29:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:29:34 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 06 13:29:34 compute-0 PackageKit[31588]: daemon start
Oct 06 13:29:34 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 06 13:29:34 compute-0 sudo[30756]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:35 compute-0 sudo[32331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkjnezivzwaxnolmfbcgvtgcthmcwmvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757374.9164033-291-107028244243712/AnsiballZ_command.py'
Oct 06 13:29:35 compute-0 sudo[32331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:35 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:29:35 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:29:35 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.588s CPU time.
Oct 06 13:29:35 compute-0 systemd[1]: run-r2264468397bb4cf980bb07586ab0773d.service: Deactivated successfully.
Oct 06 13:29:35 compute-0 python3.9[32333]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:29:36 compute-0 sudo[32331]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:37 compute-0 sudo[32613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yneokcdthialovolirohfrvkhjtrdyuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757376.711638-307-28689337332256/AnsiballZ_selinux.py'
Oct 06 13:29:37 compute-0 sudo[32613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:37 compute-0 python3.9[32615]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 06 13:29:37 compute-0 sudo[32613]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:38 compute-0 sudo[32765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npwazlhgmfkhtevnhzqnfywfsxewzflz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757378.0688233-329-79041085518839/AnsiballZ_command.py'
Oct 06 13:29:38 compute-0 sudo[32765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:38 compute-0 python3.9[32767]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 06 13:29:39 compute-0 sudo[32765]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:40 compute-0 sudo[32918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlrpanjccojehcmdbzetqwdgtkfjxypl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757379.7250826-345-234332314327605/AnsiballZ_file.py'
Oct 06 13:29:40 compute-0 sudo[32918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:41 compute-0 python3.9[32920]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:29:41 compute-0 sudo[32918]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:41 compute-0 sudo[33070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhvhgkptcusyxcyohrarigwrstuanjjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757381.4238138-361-2906640062614/AnsiballZ_mount.py'
Oct 06 13:29:41 compute-0 sudo[33070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:42 compute-0 python3.9[33072]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 06 13:29:42 compute-0 sudo[33070]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:43 compute-0 sudo[33222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eejhcriujqtpycvudhaorkmepsnzyyac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757382.8797715-417-205199759739412/AnsiballZ_file.py'
Oct 06 13:29:43 compute-0 sudo[33222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:43 compute-0 python3.9[33224]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:29:43 compute-0 sudo[33222]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:43 compute-0 sudo[33374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmthxdgmnwnklrjmjeqjkygdgcxrdiyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757383.6035395-433-231119581754893/AnsiballZ_stat.py'
Oct 06 13:29:43 compute-0 sudo[33374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:44 compute-0 python3.9[33376]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:29:44 compute-0 sudo[33374]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:44 compute-0 sshd-session[31303]: Connection closed by authenticating user root 193.32.162.157 port 52556 [preauth]
Oct 06 13:29:44 compute-0 sudo[33497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcbfajbcuxjydtrehavujydzkriawadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757383.6035395-433-231119581754893/AnsiballZ_copy.py'
Oct 06 13:29:44 compute-0 sudo[33497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:44 compute-0 python3.9[33499]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757383.6035395-433-231119581754893/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:29:44 compute-0 sudo[33497]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:46 compute-0 sudo[33649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfwdhkghjyskoblbxnffrbmsolnzqgtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757385.9284396-487-246902799459957/AnsiballZ_getent.py'
Oct 06 13:29:46 compute-0 sudo[33649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:46 compute-0 python3.9[33651]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 06 13:29:46 compute-0 sudo[33649]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:48 compute-0 sudo[33802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlgpromkliqieqgnxkdtkvgyjzeqrjmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757387.6390874-503-138083798691009/AnsiballZ_group.py'
Oct 06 13:29:48 compute-0 sudo[33802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:50 compute-0 python3.9[33804]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:29:50 compute-0 groupadd[33805]: group added to /etc/group: name=qemu, GID=107
Oct 06 13:29:50 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 13:29:50 compute-0 groupadd[33805]: group added to /etc/gshadow: name=qemu
Oct 06 13:29:50 compute-0 groupadd[33805]: new group: name=qemu, GID=107
Oct 06 13:29:50 compute-0 sudo[33802]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:50 compute-0 sudo[33961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqwvbntrymtyqpgueimijgbqphqdhobf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757390.3551424-519-23373107876960/AnsiballZ_user.py'
Oct 06 13:29:50 compute-0 sudo[33961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:51 compute-0 python3.9[33963]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 06 13:29:51 compute-0 useradd[33965]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Oct 06 13:29:51 compute-0 sudo[33961]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:51 compute-0 sudo[34121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsetngmetusblpyvfsyaivwnnsypkemt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757391.547607-535-186359255809060/AnsiballZ_getent.py'
Oct 06 13:29:51 compute-0 sudo[34121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:52 compute-0 python3.9[34123]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 06 13:29:52 compute-0 sudo[34121]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:52 compute-0 sudo[34274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqcslvzwahuxjydipwwhxcnfqwqkaqln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757392.2586308-551-122730629910016/AnsiballZ_group.py'
Oct 06 13:29:52 compute-0 sudo[34274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:52 compute-0 python3.9[34276]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:29:52 compute-0 groupadd[34277]: group added to /etc/group: name=hugetlbfs, GID=42477
Oct 06 13:29:52 compute-0 groupadd[34277]: group added to /etc/gshadow: name=hugetlbfs
Oct 06 13:29:52 compute-0 groupadd[34277]: new group: name=hugetlbfs, GID=42477
Oct 06 13:29:52 compute-0 sudo[34274]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:53 compute-0 sudo[34432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqvjtijiahevsxqlzdypdzkzhuzkhti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757393.1386416-569-68446391221563/AnsiballZ_file.py'
Oct 06 13:29:53 compute-0 sudo[34432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:53 compute-0 python3.9[34434]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 06 13:29:53 compute-0 sudo[34432]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:54 compute-0 sudo[34584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecawdvjipaxxwcrsdsubdcevqwhwdhjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757393.99267-591-46309932983231/AnsiballZ_dnf.py'
Oct 06 13:29:54 compute-0 sudo[34584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:54 compute-0 python3.9[34586]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:29:56 compute-0 sudo[34584]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:56 compute-0 sudo[34737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oomvawkoaxlexizhlibosdykwouiyiet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757396.2395813-607-103050144889893/AnsiballZ_file.py'
Oct 06 13:29:56 compute-0 sudo[34737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:56 compute-0 python3.9[34739]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:29:56 compute-0 sudo[34737]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:57 compute-0 sudo[34889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txjfrealsrlmxzilkjqehzucvaplpwxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757396.9562004-623-121892636955588/AnsiballZ_stat.py'
Oct 06 13:29:57 compute-0 sudo[34889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:57 compute-0 python3.9[34891]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:29:57 compute-0 sudo[34889]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:57 compute-0 sudo[35012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijcksrecypgvpnuapunuodiszfoslssm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757396.9562004-623-121892636955588/AnsiballZ_copy.py'
Oct 06 13:29:57 compute-0 sudo[35012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:58 compute-0 python3.9[35014]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757396.9562004-623-121892636955588/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:29:58 compute-0 sudo[35012]: pam_unix(sudo:session): session closed for user root
Oct 06 13:29:58 compute-0 sudo[35164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acexxtigbmwndmvpqozgeasghmfuheki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757398.2978897-653-189006664894419/AnsiballZ_systemd.py'
Oct 06 13:29:58 compute-0 sudo[35164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:29:59 compute-0 python3.9[35166]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:29:59 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 06 13:29:59 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 06 13:29:59 compute-0 kernel: Bridge firewalling registered
Oct 06 13:29:59 compute-0 systemd-modules-load[35170]: Inserted module 'br_netfilter'
Oct 06 13:29:59 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 06 13:29:59 compute-0 sudo[35164]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:00 compute-0 sudo[35325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdktvvewmewqblpwtrdjxxxbketcbrlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757399.6723087-669-170017528837512/AnsiballZ_stat.py'
Oct 06 13:30:00 compute-0 sudo[35325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:00 compute-0 python3.9[35327]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:30:00 compute-0 sudo[35325]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:00 compute-0 sudo[35448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzjulsfiizmhmpcumffkguqqoxbyigif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757399.6723087-669-170017528837512/AnsiballZ_copy.py'
Oct 06 13:30:00 compute-0 sudo[35448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:00 compute-0 python3.9[35450]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757399.6723087-669-170017528837512/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:00 compute-0 sudo[35448]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:01 compute-0 sudo[35600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcdwaslhvsaftmdvrmrqsfodyswpzabl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757401.351645-705-109301586510862/AnsiballZ_dnf.py'
Oct 06 13:30:01 compute-0 sudo[35600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:01 compute-0 python3.9[35602]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:30:05 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:30:05 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:30:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:30:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:30:05 compute-0 systemd[1]: Reloading.
Oct 06 13:30:06 compute-0 systemd-rc-local-generator[35663]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:30:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:30:06 compute-0 sudo[35600]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:07 compute-0 python3.9[36696]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:30:08 compute-0 python3.9[38048]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 06 13:30:09 compute-0 python3.9[38706]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:30:09 compute-0 sudo[39569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrknjobfqhgesgctkklkijcltydqulz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757409.6006832-783-197628404548473/AnsiballZ_command.py'
Oct 06 13:30:09 compute-0 sudo[39569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:10 compute-0 python3.9[39600]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:10 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 06 13:30:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:30:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:30:10 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.491s CPU time.
Oct 06 13:30:10 compute-0 systemd[1]: run-r8898bb549695433393cfa687cb38df69.service: Deactivated successfully.
Oct 06 13:30:10 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 06 13:30:10 compute-0 sudo[39569]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:11 compute-0 sudo[40149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnvpdjuxeygdpfmvmwrtvimrbtvjzitz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757411.1024714-801-185076458714028/AnsiballZ_systemd.py'
Oct 06 13:30:11 compute-0 sudo[40149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:11 compute-0 python3.9[40151]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:30:11 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 06 13:30:11 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Oct 06 13:30:11 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 06 13:30:11 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 06 13:30:12 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 06 13:30:12 compute-0 sudo[40149]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:13 compute-0 python3.9[40312]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 06 13:30:15 compute-0 sudo[40462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllictorntivmxvgtaxydoxaznljyfge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757414.8531694-915-104333686719373/AnsiballZ_systemd.py'
Oct 06 13:30:15 compute-0 sudo[40462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:15 compute-0 irqbalance[782]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 06 13:30:15 compute-0 irqbalance[782]: IRQ 26 affinity is now unmanaged
Oct 06 13:30:15 compute-0 python3.9[40464]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:30:15 compute-0 systemd[1]: Reloading.
Oct 06 13:30:15 compute-0 systemd-rc-local-generator[40489]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:30:15 compute-0 sudo[40462]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:16 compute-0 sudo[40651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmtajtjtjekhyuqocbvorwyplljyruyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757415.972741-915-242111004044805/AnsiballZ_systemd.py'
Oct 06 13:30:16 compute-0 sudo[40651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:16 compute-0 python3.9[40653]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:30:16 compute-0 systemd[1]: Reloading.
Oct 06 13:30:16 compute-0 systemd-rc-local-generator[40685]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:30:16 compute-0 sudo[40651]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:17 compute-0 sudo[40840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iffvuphbrvlqxotccfqsrozwepxgkqul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757417.148852-947-185676780843778/AnsiballZ_command.py'
Oct 06 13:30:17 compute-0 sudo[40840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:17 compute-0 python3.9[40842]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:17 compute-0 sudo[40840]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:18 compute-0 sudo[40993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaslsmqycbideajtbbijuzgnkxdgpmqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757418.057386-963-84169081313194/AnsiballZ_command.py'
Oct 06 13:30:18 compute-0 sudo[40993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:18 compute-0 python3.9[40995]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:18 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 06 13:30:18 compute-0 sudo[40993]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:19 compute-0 sudo[41146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxoumaorpffqgulvuyvcnsagkasnjwwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757418.8086786-979-262701261120387/AnsiballZ_command.py'
Oct 06 13:30:19 compute-0 sudo[41146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:19 compute-0 python3.9[41148]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:20 compute-0 sudo[41146]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:21 compute-0 sudo[41308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umlekaxdtssnvtuxuixofmtmnexynrzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757421.0646498-995-275411358902571/AnsiballZ_command.py'
Oct 06 13:30:21 compute-0 sudo[41308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:21 compute-0 python3.9[41310]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:21 compute-0 sudo[41308]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:22 compute-0 sudo[41461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oktrngeyselziarzoznhzfoljzzfocaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757421.801906-1011-11228954247389/AnsiballZ_systemd.py'
Oct 06 13:30:22 compute-0 sudo[41461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:22 compute-0 python3.9[41463]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:30:22 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 06 13:30:22 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Oct 06 13:30:22 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Oct 06 13:30:22 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct 06 13:30:22 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 06 13:30:22 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct 06 13:30:22 compute-0 sudo[41461]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:22 compute-0 sshd-session[28438]: Connection closed by 192.168.122.30 port 53688
Oct 06 13:30:22 compute-0 sshd-session[28435]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:30:22 compute-0 systemd-logind[789]: Session 10 logged out. Waiting for processes to exit.
Oct 06 13:30:22 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct 06 13:30:22 compute-0 systemd[1]: session-10.scope: Consumed 2min 17.696s CPU time.
Oct 06 13:30:22 compute-0 systemd-logind[789]: Removed session 10.
Oct 06 13:30:28 compute-0 sshd-session[41494]: Accepted publickey for zuul from 192.168.122.30 port 33996 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:30:28 compute-0 systemd-logind[789]: New session 11 of user zuul.
Oct 06 13:30:28 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct 06 13:30:28 compute-0 sshd-session[41494]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:30:30 compute-0 python3.9[41647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:30:31 compute-0 python3.9[41801]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:30:32 compute-0 sudo[41955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpuagakrqaxpzfdgkkntcjjxptppsckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757432.052451-80-245747696081798/AnsiballZ_command.py'
Oct 06 13:30:32 compute-0 sudo[41955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:32 compute-0 python3.9[41957]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:32 compute-0 sudo[41955]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:33 compute-0 python3.9[42108]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:30:34 compute-0 sudo[42262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fixbjbfoqazcmgerecxkoslcrjyzgkob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757434.323232-120-269239377397129/AnsiballZ_setup.py'
Oct 06 13:30:34 compute-0 sudo[42262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:34 compute-0 python3.9[42264]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:30:35 compute-0 sudo[42262]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:35 compute-0 sudo[42346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynjagowkfrilzmorioraobtommmpoihh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757434.323232-120-269239377397129/AnsiballZ_dnf.py'
Oct 06 13:30:35 compute-0 sudo[42346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:35 compute-0 python3.9[42348]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:30:37 compute-0 sudo[42346]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:37 compute-0 sudo[42499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uigbnkzaypzwmurqpcnksyebuuthyqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757437.2559721-144-213401620554820/AnsiballZ_setup.py'
Oct 06 13:30:37 compute-0 sudo[42499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:37 compute-0 python3.9[42501]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:30:38 compute-0 sudo[42499]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:38 compute-0 sudo[42670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omiexrlwejktqpwwhjqhltepyccpgnsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757438.41117-166-58518027276930/AnsiballZ_file.py'
Oct 06 13:30:38 compute-0 sudo[42670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:39 compute-0 python3.9[42672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:30:39 compute-0 sudo[42670]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:39 compute-0 sudo[42822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nosmmaduuheakucerbtevvmovwtubxof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757439.2549999-182-217224661633500/AnsiballZ_command.py'
Oct 06 13:30:39 compute-0 sudo[42822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:39 compute-0 python3.9[42824]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:30:39 compute-0 podman[42825]: 2025-10-06 13:30:39.933235067 +0000 UTC m=+0.072768077 system refresh
Oct 06 13:30:39 compute-0 sudo[42822]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:40 compute-0 sudo[42985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmlaitpxxkjtxahptflouugibcssyhpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757440.184866-198-210820409455197/AnsiballZ_stat.py'
Oct 06 13:30:40 compute-0 sudo[42985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:40 compute-0 python3.9[42987]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:30:40 compute-0 sudo[42985]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:30:41 compute-0 sudo[43108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnqfdepajnkcfscurhpklgzjznqzybgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757440.184866-198-210820409455197/AnsiballZ_copy.py'
Oct 06 13:30:41 compute-0 sudo[43108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:41 compute-0 python3.9[43110]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757440.184866-198-210820409455197/.source.json follow=False _original_basename=podman_network_config.j2 checksum=4ff9cfb9783461c4be57a261add857b5ab953dc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:30:41 compute-0 sudo[43108]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:42 compute-0 sudo[43260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fucpdvawoyxshwalyywnsyiwbikvinna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757441.7646935-228-199482296387503/AnsiballZ_stat.py'
Oct 06 13:30:42 compute-0 sudo[43260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:42 compute-0 python3.9[43262]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:30:42 compute-0 sudo[43260]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:42 compute-0 sudo[43383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfjcyfnmmrzxfgcnxelifnzvfcujlxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757441.7646935-228-199482296387503/AnsiballZ_copy.py'
Oct 06 13:30:42 compute-0 sudo[43383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:43 compute-0 python3.9[43385]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757441.7646935-228-199482296387503/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a0aff2918c593ab2ecb6e63575425d335b728816 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:43 compute-0 sudo[43383]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:43 compute-0 sudo[43535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovsvrwwdlqipcdmrghktoefhdlpgpksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757443.2260797-260-69947912311060/AnsiballZ_ini_file.py'
Oct 06 13:30:43 compute-0 sudo[43535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:43 compute-0 python3.9[43537]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:44 compute-0 sudo[43535]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:44 compute-0 sudo[43687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naszvnnojdmrmkrlpptlbwcoswpqmfqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757444.1682441-260-272534859126561/AnsiballZ_ini_file.py'
Oct 06 13:30:44 compute-0 sudo[43687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:44 compute-0 python3.9[43689]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:44 compute-0 sudo[43687]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:45 compute-0 sudo[43839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wexifdsgfiijtuptipuvvxdagkkykspv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757444.9291978-260-36110840011327/AnsiballZ_ini_file.py'
Oct 06 13:30:45 compute-0 sudo[43839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:45 compute-0 python3.9[43841]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:45 compute-0 sudo[43839]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:45 compute-0 sudo[43991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ompcbpbzksdusisvzvwcsbrigzylvrsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757445.6335974-260-44674548690058/AnsiballZ_ini_file.py'
Oct 06 13:30:45 compute-0 sudo[43991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:46 compute-0 python3.9[43993]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:30:46 compute-0 sudo[43991]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:47 compute-0 python3.9[44143]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:30:47 compute-0 sudo[44296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrvdzczxqrjfmjbsqmmitcmpqswwnzyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757447.2890687-340-78440622106038/AnsiballZ_dnf.py'
Oct 06 13:30:47 compute-0 sudo[44296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:47 compute-0 python3.9[44298]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:30:48 compute-0 sudo[44296]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:49 compute-0 sudo[44449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrufzreyiqltpwcgbcssutbvsboksamj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757449.1654887-356-223069514322414/AnsiballZ_dnf.py'
Oct 06 13:30:49 compute-0 sudo[44449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:49 compute-0 python3.9[44451]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:30:51 compute-0 sudo[44449]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:52 compute-0 sudo[44609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbdnzqxzamacpkukkydoheidxguveoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757451.7698984-376-10795199333512/AnsiballZ_dnf.py'
Oct 06 13:30:52 compute-0 sudo[44609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:52 compute-0 python3.9[44611]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:30:53 compute-0 sudo[44609]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:54 compute-0 sudo[44762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psbacqocnbaobcdvuljdqclcbgarzyrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757453.793056-394-110686886083044/AnsiballZ_dnf.py'
Oct 06 13:30:54 compute-0 sudo[44762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:54 compute-0 python3.9[44764]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:30:55 compute-0 sudo[44762]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:56 compute-0 sudo[44915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elxzchcrgikyqfabqvqgbsjybqxryqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757456.0059211-416-22715242397656/AnsiballZ_dnf.py'
Oct 06 13:30:56 compute-0 sudo[44915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:56 compute-0 python3.9[44917]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:30:58 compute-0 sudo[44915]: pam_unix(sudo:session): session closed for user root
Oct 06 13:30:58 compute-0 sudo[45071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dprymalsgxbtebrfufgmwwifcrgkmits ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757458.2349112-432-112955107476713/AnsiballZ_dnf.py'
Oct 06 13:30:58 compute-0 sudo[45071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:30:58 compute-0 python3.9[45073]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:31:01 compute-0 sudo[45071]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:01 compute-0 sudo[45240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsokgphkohwmgtfolocndiqrhqfxtbuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757461.6136317-450-22585952066203/AnsiballZ_dnf.py'
Oct 06 13:31:01 compute-0 sudo[45240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:02 compute-0 python3.9[45242]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:31:03 compute-0 sudo[45240]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:03 compute-0 sudo[45393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihmbtmmwcfwqcugvvwpllklqiypvaglu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757463.5266721-468-155562599346806/AnsiballZ_dnf.py'
Oct 06 13:31:03 compute-0 sudo[45393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:04 compute-0 python3.9[45395]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:31:17 compute-0 sudo[45393]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:18 compute-0 sudo[45730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuzsejgykmapqhznppkbmmsdqunbavsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757477.803245-490-118940197589611/AnsiballZ_file.py'
Oct 06 13:31:18 compute-0 sudo[45730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:18 compute-0 python3.9[45732]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:31:18 compute-0 sudo[45730]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:19 compute-0 sudo[45905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwqkzdyydndiruwxxreppdqgsfudqrwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757478.5924711-506-102361703839939/AnsiballZ_stat.py'
Oct 06 13:31:19 compute-0 sudo[45905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:19 compute-0 python3.9[45907]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:31:19 compute-0 sudo[45905]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:19 compute-0 sudo[46028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsktfdfueefczkocafbpuosyjpavsowj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757478.5924711-506-102361703839939/AnsiballZ_copy.py'
Oct 06 13:31:19 compute-0 sudo[46028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:19 compute-0 python3.9[46030]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759757478.5924711-506-102361703839939/.source.json _original_basename=.o7tdlza6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:31:19 compute-0 sudo[46028]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:20 compute-0 sudo[46180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gilogvdutlyujnzaxpsnzmamggbpcehk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757480.1744242-542-14036620210618/AnsiballZ_podman_image.py'
Oct 06 13:31:20 compute-0 sudo[46180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:20 compute-0 python3.9[46182]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3692109812-lower\x2dmapped.mount: Deactivated successfully.
Oct 06 13:31:28 compute-0 podman[46194]: 2025-10-06 13:31:28.428990778 +0000 UTC m=+7.421047384 image pull 0b62d011736892703306395462c684fe0dfe1473b0a9397423133e591c417adb 38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 06 13:31:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:28 compute-0 sudo[46180]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:29 compute-0 sudo[46490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbebggegnjebennspgitosilgvajoqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757489.075828-560-76448880038054/AnsiballZ_podman_image.py'
Oct 06 13:31:29 compute-0 sudo[46490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:29 compute-0 python3.9[46492]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:31 compute-0 podman[46504]: 2025-10-06 13:31:31.219972254 +0000 UTC m=+1.477064742 image pull 2c4150b67f2803f56f4e9488a6a1d434787a7813c9b1fcb4aed975e77b886b52 38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 06 13:31:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:31 compute-0 sudo[46490]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:32 compute-0 sudo[46756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcaehahqwtagejuvugpxeecjaloaphgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757491.799995-582-254563516647756/AnsiballZ_podman_image.py'
Oct 06 13:31:32 compute-0 sudo[46756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:32 compute-0 python3.9[46758]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:39 compute-0 podman[46769]: 2025-10-06 13:31:39.254506113 +0000 UTC m=+6.818386644 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 13:31:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:39 compute-0 sudo[46756]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:40 compute-0 sudo[47077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niojkspkuoasdnjvsbtdbrvzefcfmurp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757499.8546422-602-147012168359197/AnsiballZ_podman_image.py'
Oct 06 13:31:40 compute-0 sudo[47077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:40 compute-0 python3.9[47079]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:40 compute-0 podman[47091]: 2025-10-06 13:31:40.933512149 +0000 UTC m=+0.391012629 image pull a64e163f15f11e74249854aa8fb3596c33858cb805250d3c9483585fd4a94bdb 38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 06 13:31:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:41 compute-0 sudo[47077]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:41 compute-0 sudo[47323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvarveucaafcsgluyenfqwafnmekgkng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757501.5547302-620-203143824347660/AnsiballZ_podman_image.py'
Oct 06 13:31:41 compute-0 sudo[47323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:42 compute-0 python3.9[47325]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:55 compute-0 podman[47337]: 2025-10-06 13:31:55.374147571 +0000 UTC m=+13.154456363 image pull 920af19e5030aa8d226c8406b11c407c332317d692c620edd5e546aed379868d 38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 06 13:31:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:31:55 compute-0 sudo[47323]: pam_unix(sudo:session): session closed for user root
Oct 06 13:31:56 compute-0 sudo[47593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlybvymqifnuayexuvhowmhfbdqtwymb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757516.0615263-642-124638356038407/AnsiballZ_podman_image.py'
Oct 06 13:31:56 compute-0 sudo[47593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:31:56 compute-0 python3.9[47595]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.151:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:31:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:00 compute-0 podman[47607]: 2025-10-06 13:32:00.427444779 +0000 UTC m=+3.760231567 image pull c71138a4630fd42bc1bedbf40933f7e94ce3c9984a8ea2ae75e135f33b3e0297 38.102.83.151:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Oct 06 13:32:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:00 compute-0 sudo[47593]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:01 compute-0 sudo[47863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptuklvkqbksztzaanvqhvbexgxvztpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757520.9224648-642-144442651682885/AnsiballZ_podman_image.py'
Oct 06 13:32:01 compute-0 sudo[47863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:01 compute-0 python3.9[47865]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 06 13:32:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:01 compute-0 anacron[3714]: Job `cron.weekly' started
Oct 06 13:32:01 compute-0 anacron[3714]: Job `cron.weekly' terminated
Oct 06 13:32:02 compute-0 podman[47880]: 2025-10-06 13:32:02.911863872 +0000 UTC m=+1.326652443 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct 06 13:32:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:32:03 compute-0 sudo[47863]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:03 compute-0 sshd-session[41497]: Connection closed by 192.168.122.30 port 33996
Oct 06 13:32:03 compute-0 sshd-session[41494]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:32:03 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct 06 13:32:03 compute-0 systemd[1]: session-11.scope: Consumed 1min 57.053s CPU time.
Oct 06 13:32:03 compute-0 systemd-logind[789]: Session 11 logged out. Waiting for processes to exit.
Oct 06 13:32:03 compute-0 systemd-logind[789]: Removed session 11.
Oct 06 13:32:08 compute-0 sshd-session[48029]: Accepted publickey for zuul from 192.168.122.30 port 32910 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:32:08 compute-0 systemd-logind[789]: New session 12 of user zuul.
Oct 06 13:32:08 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct 06 13:32:08 compute-0 sshd-session[48029]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:32:10 compute-0 python3.9[48182]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:32:11 compute-0 sudo[48336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grtfbqbcndiejifffvoknqeobvzlldmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757530.582508-52-225495655599346/AnsiballZ_getent.py'
Oct 06 13:32:11 compute-0 sudo[48336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:11 compute-0 python3.9[48338]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 06 13:32:11 compute-0 sudo[48336]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:11 compute-0 sudo[48489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhhzlttaslkuuytqyuglufbxdruieifa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757531.4556174-68-12076029105203/AnsiballZ_group.py'
Oct 06 13:32:11 compute-0 sudo[48489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:12 compute-0 python3.9[48491]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:32:12 compute-0 groupadd[48492]: group added to /etc/group: name=openvswitch, GID=42476
Oct 06 13:32:12 compute-0 groupadd[48492]: group added to /etc/gshadow: name=openvswitch
Oct 06 13:32:12 compute-0 groupadd[48492]: new group: name=openvswitch, GID=42476
Oct 06 13:32:12 compute-0 sudo[48489]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:12 compute-0 sudo[48647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvqeamtltccloaiprettsixmxquvwqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757532.423413-84-222086292785779/AnsiballZ_user.py'
Oct 06 13:32:12 compute-0 sudo[48647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:13 compute-0 python3.9[48649]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 06 13:32:13 compute-0 useradd[48651]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Oct 06 13:32:13 compute-0 useradd[48651]: add 'openvswitch' to group 'hugetlbfs'
Oct 06 13:32:13 compute-0 useradd[48651]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 06 13:32:13 compute-0 sudo[48647]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:13 compute-0 sudo[48807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gupoxtfmdmhjjolcomwyxzpwcotoojsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757533.5163033-104-241721142252763/AnsiballZ_setup.py'
Oct 06 13:32:13 compute-0 sudo[48807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:14 compute-0 python3.9[48809]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:32:14 compute-0 sudo[48807]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:14 compute-0 sudo[48891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peeufmpguovasrtfkjmijmziqfbxdlbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757533.5163033-104-241721142252763/AnsiballZ_dnf.py'
Oct 06 13:32:14 compute-0 sudo[48891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:14 compute-0 python3.9[48893]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:32:16 compute-0 sudo[48891]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:17 compute-0 sudo[49054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evmkyqpllzmjwvcwrbxtisfxmoyleqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757536.8070886-132-266016290927419/AnsiballZ_dnf.py'
Oct 06 13:32:17 compute-0 sudo[49054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:17 compute-0 python3.9[49056]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:32:28 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:32:28 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:32:28 compute-0 groupadd[49079]: group added to /etc/group: name=unbound, GID=993
Oct 06 13:32:28 compute-0 groupadd[49079]: group added to /etc/gshadow: name=unbound
Oct 06 13:32:28 compute-0 groupadd[49079]: new group: name=unbound, GID=993
Oct 06 13:32:29 compute-0 useradd[49086]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Oct 06 13:32:29 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 06 13:32:29 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 06 13:32:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:32:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:32:30 compute-0 systemd[1]: Reloading.
Oct 06 13:32:30 compute-0 systemd-rc-local-generator[49584]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:32:30 compute-0 systemd-sysv-generator[49588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:32:31 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:32:31 compute-0 sudo[49054]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:32:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:32:31 compute-0 systemd[1]: run-r686175f6aba4473f89633d909aec0d98.service: Deactivated successfully.
Oct 06 13:32:32 compute-0 sudo[50156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgldzczzjvmtwfxqlfljmqgachtlsphd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757551.8445659-148-119215593827207/AnsiballZ_systemd.py'
Oct 06 13:32:32 compute-0 sudo[50156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:32 compute-0 python3.9[50158]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:32:32 compute-0 systemd[1]: Reloading.
Oct 06 13:32:32 compute-0 systemd-rc-local-generator[50185]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:32:32 compute-0 systemd-sysv-generator[50193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:32:33 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct 06 13:32:33 compute-0 chown[50202]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 06 13:32:33 compute-0 ovs-ctl[50207]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 06 13:32:33 compute-0 ovs-ctl[50207]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 06 13:32:33 compute-0 ovs-ctl[50207]: Starting ovsdb-server [  OK  ]
Oct 06 13:32:33 compute-0 ovs-vsctl[50256]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 06 13:32:33 compute-0 ovs-vsctl[50275]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"6cb79b8b-7bef-432f-9e10-9690a1ce5aa4\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 06 13:32:33 compute-0 ovs-ctl[50207]: Configuring Open vSwitch system IDs [  OK  ]
Oct 06 13:32:33 compute-0 ovs-ctl[50207]: Enabling remote OVSDB managers [  OK  ]
Oct 06 13:32:33 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct 06 13:32:33 compute-0 ovs-vsctl[50281]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 06 13:32:33 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 06 13:32:33 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 06 13:32:33 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 06 13:32:33 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct 06 13:32:33 compute-0 ovs-ctl[50326]: Inserting openvswitch module [  OK  ]
Oct 06 13:32:34 compute-0 ovs-ctl[50295]: Starting ovs-vswitchd [  OK  ]
Oct 06 13:32:34 compute-0 ovs-vsctl[50345]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 06 13:32:34 compute-0 ovs-ctl[50295]: Enabling remote OVSDB managers [  OK  ]
Oct 06 13:32:34 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 06 13:32:34 compute-0 systemd[1]: Starting Open vSwitch...
Oct 06 13:32:34 compute-0 systemd[1]: Finished Open vSwitch.
Oct 06 13:32:34 compute-0 sudo[50156]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:35 compute-0 python3.9[50497]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:32:35 compute-0 sudo[50647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhzzgsocfmuqnpxmjgctadihopzejssl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757555.264742-184-46605010222562/AnsiballZ_sefcontext.py'
Oct 06 13:32:35 compute-0 sudo[50647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:36 compute-0 python3.9[50649]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 06 13:32:37 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:32:37 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:32:37 compute-0 sudo[50647]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:38 compute-0 python3.9[50805]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:32:39 compute-0 sudo[50961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnktynjehttnvmhevvhuzdyzemfvqwlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757558.8889534-220-233119466723450/AnsiballZ_dnf.py'
Oct 06 13:32:39 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 06 13:32:39 compute-0 sudo[50961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:39 compute-0 python3.9[50963]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:32:40 compute-0 sudo[50961]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:41 compute-0 sudo[51114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqrtuozjbgraaosgrrguaouynounltmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757560.7935781-236-90390271166827/AnsiballZ_command.py'
Oct 06 13:32:41 compute-0 sudo[51114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:41 compute-0 python3.9[51116]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:32:42 compute-0 sudo[51114]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:43 compute-0 sudo[51401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-intpepukvctfmiwhdrawvulctdpoqfuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757562.593902-252-129932826375381/AnsiballZ_file.py'
Oct 06 13:32:43 compute-0 sudo[51401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:43 compute-0 python3.9[51403]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 06 13:32:43 compute-0 sudo[51401]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:44 compute-0 python3.9[51553]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:32:44 compute-0 sudo[51705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lryqfsflvhrvntauoteeebxrslxggtxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757564.243174-284-88113694700281/AnsiballZ_dnf.py'
Oct 06 13:32:44 compute-0 sudo[51705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:44 compute-0 python3.9[51707]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:32:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:32:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:32:46 compute-0 systemd[1]: Reloading.
Oct 06 13:32:46 compute-0 systemd-rc-local-generator[51748]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:32:46 compute-0 systemd-sysv-generator[51751]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:32:47 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:32:47 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:32:47 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:32:47 compute-0 systemd[1]: run-rdb7e0236c3e24db4afa7a7031093cd8b.service: Deactivated successfully.
Oct 06 13:32:47 compute-0 sudo[51705]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:47 compute-0 sudo[52024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukanfxhzdapucctupvxrerbzwkanxjua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757567.5803063-300-215582761975213/AnsiballZ_systemd.py'
Oct 06 13:32:47 compute-0 sudo[52024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:48 compute-0 python3.9[52026]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:32:48 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 06 13:32:48 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Oct 06 13:32:48 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Oct 06 13:32:48 compute-0 systemd[1]: Stopping Network Manager...
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2600] caught SIGTERM, shutting down normally.
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2614] dhcp4 (eth0): canceled DHCP transaction
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2614] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2615] dhcp4 (eth0): state changed no lease
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2617] manager: NetworkManager state is now CONNECTED_SITE
Oct 06 13:32:48 compute-0 NetworkManager[3953]: <info>  [1759757568.2692] exiting (success)
Oct 06 13:32:48 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:32:48 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 06 13:32:48 compute-0 systemd[1]: Stopped Network Manager.
Oct 06 13:32:48 compute-0 systemd[1]: NetworkManager.service: Consumed 12.873s CPU time, 4.0M memory peak, read 0B from disk, written 17.0K to disk.
Oct 06 13:32:48 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:32:48 compute-0 systemd[1]: Starting Network Manager...
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.3320] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:58776300-6201-422f-aac2-b277cfa9c8d1)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.3321] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.3421] manager[0x557f1f2e1090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 06 13:32:48 compute-0 systemd[1]: Starting Hostname Service...
Oct 06 13:32:48 compute-0 systemd[1]: Started Hostname Service.
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4253] hostname: hostname: using hostnamed
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4254] hostname: static hostname changed from (none) to "compute-0"
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4258] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4262] manager[0x557f1f2e1090]: rfkill: Wi-Fi hardware radio set enabled
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4263] manager[0x557f1f2e1090]: rfkill: WWAN hardware radio set enabled
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4280] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4287] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4288] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4289] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4290] manager: Networking is enabled by state file
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4293] settings: Loaded settings plugin: keyfile (internal)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4296] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4318] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4327] dhcp: init: Using DHCP client 'internal'
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4330] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4334] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4340] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4346] device (lo): Activation: starting connection 'lo' (d21fa287-6890-4cd6-bfdf-64464e1b01d1)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4352] device (eth0): carrier: link connected
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4356] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4360] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4361] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4367] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4373] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4377] device (eth1): carrier: link connected
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4381] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4385] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ee89cfcf-f0a3-5a6c-9105-8e4b6101123c) (indicated)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4386] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4392] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4398] device (eth1): Activation: starting connection 'ci-private-network' (ee89cfcf-f0a3-5a6c-9105-8e4b6101123c)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4404] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 06 13:32:48 compute-0 systemd[1]: Started Network Manager.
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4410] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4412] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4414] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4415] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4418] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4422] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4424] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4429] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4435] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4438] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4455] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4466] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4472] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4474] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4481] device (lo): Activation: successful, device activated.
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4488] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Oct 06 13:32:48 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4497] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 06 13:32:48 compute-0 sudo[52024]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4863] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4875] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4881] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4883] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4886] device (eth1): Activation: successful, device activated.
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4929] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4931] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4935] manager: NetworkManager state is now CONNECTED_SITE
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4940] device (eth0): Activation: successful, device activated.
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4947] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 06 13:32:48 compute-0 NetworkManager[52035]: <info>  [1759757568.4951] manager: startup complete
Oct 06 13:32:48 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct 06 13:32:48 compute-0 sudo[52250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kswwwnzoygcqzogonejehnyqyefclkem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757568.6240814-316-210823749238279/AnsiballZ_dnf.py'
Oct 06 13:32:48 compute-0 sudo[52250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:49 compute-0 python3.9[52252]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:32:53 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:32:53 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:32:53 compute-0 systemd[1]: Reloading.
Oct 06 13:32:54 compute-0 systemd-rc-local-generator[52300]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:32:54 compute-0 systemd-sysv-generator[52303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:32:54 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:32:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:32:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:32:54 compute-0 systemd[1]: run-r09774c1c11f640558aa69c7314d1528f.service: Deactivated successfully.
Oct 06 13:32:54 compute-0 sudo[52250]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:55 compute-0 sudo[52714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyxlkitngyyumtdzyocrrxuvqmsppixf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757575.2613893-340-267000222182441/AnsiballZ_stat.py'
Oct 06 13:32:55 compute-0 sudo[52714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:55 compute-0 python3.9[52716]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:32:55 compute-0 sudo[52714]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:56 compute-0 sudo[52866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ispmgppmkhhezhlyoxomfwmqmwnelbnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757576.044524-358-235286777860979/AnsiballZ_ini_file.py'
Oct 06 13:32:56 compute-0 sudo[52866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:56 compute-0 python3.9[52868]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:32:56 compute-0 sudo[52866]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:58 compute-0 sudo[53020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbdiwkqftxpumkinxnnvyldwzrdknfom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757578.0019188-378-168285018791620/AnsiballZ_ini_file.py'
Oct 06 13:32:58 compute-0 sudo[53020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:58 compute-0 python3.9[53022]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:32:58 compute-0 sudo[53020]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:58 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:32:59 compute-0 sudo[53172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwywupubcsjcslnwxrakciwetzgxybbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757578.7415233-378-220143271099585/AnsiballZ_ini_file.py'
Oct 06 13:32:59 compute-0 sudo[53172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:32:59 compute-0 python3.9[53174]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:32:59 compute-0 sudo[53172]: pam_unix(sudo:session): session closed for user root
Oct 06 13:32:59 compute-0 sudo[53324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmzajbdorzyirzlvjimfztirhkswovnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757579.567217-408-31391296107975/AnsiballZ_ini_file.py'
Oct 06 13:32:59 compute-0 sudo[53324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:00 compute-0 python3.9[53326]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:00 compute-0 sudo[53324]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:00 compute-0 sudo[53476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcsbuhuuxbwlboyexsxnkswhuayihnwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757580.233751-408-224708413627502/AnsiballZ_ini_file.py'
Oct 06 13:33:00 compute-0 sudo[53476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:00 compute-0 python3.9[53478]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:00 compute-0 sudo[53476]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:01 compute-0 sudo[53628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjbibgqrmibxxfsnoalfyopzrqnnzrqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757581.1348877-438-171806702271361/AnsiballZ_stat.py'
Oct 06 13:33:01 compute-0 sudo[53628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:01 compute-0 python3.9[53630]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:33:01 compute-0 sudo[53628]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:02 compute-0 sudo[53751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azvkapvppgkoqfajspxhyuyueijfhrma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757581.1348877-438-171806702271361/AnsiballZ_copy.py'
Oct 06 13:33:02 compute-0 sudo[53751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:02 compute-0 python3.9[53753]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757581.1348877-438-171806702271361/.source _original_basename=.53xj5zdk follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:02 compute-0 sudo[53751]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:02 compute-0 sudo[53903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhwphyklhrwyyuvxtmoayakyklrmybky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757582.559346-468-16939492718987/AnsiballZ_file.py'
Oct 06 13:33:02 compute-0 sudo[53903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:03 compute-0 python3.9[53905]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:03 compute-0 sudo[53903]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:03 compute-0 sudo[54055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaaxdejubtaquxgwuiexpvkupfjqxbsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757583.2630136-484-265980914116547/AnsiballZ_edpm_os_net_config_mappings.py'
Oct 06 13:33:03 compute-0 sudo[54055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:03 compute-0 python3.9[54057]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 06 13:33:03 compute-0 sudo[54055]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:04 compute-0 sudo[54207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmxlkvgxtrskuqsihhxmphkhjrjvylnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757584.146091-502-235753021847781/AnsiballZ_file.py'
Oct 06 13:33:04 compute-0 sudo[54207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:04 compute-0 python3.9[54209]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:04 compute-0 sudo[54207]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:05 compute-0 sudo[54359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjhzgzopmbmoasnkhiqkshxmvmktyhvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757585.0553758-522-256192616948898/AnsiballZ_stat.py'
Oct 06 13:33:05 compute-0 sudo[54359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:05 compute-0 sudo[54359]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:06 compute-0 sudo[54482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shjqzksrfvgbxqzattgsucjjhflngjcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757585.0553758-522-256192616948898/AnsiballZ_copy.py'
Oct 06 13:33:06 compute-0 sudo[54482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:06 compute-0 sudo[54482]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:07 compute-0 sudo[54634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txafgdnagvoccpnffnuysrzunfndtgjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757586.5943964-552-24025170787495/AnsiballZ_slurp.py'
Oct 06 13:33:07 compute-0 sudo[54634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:07 compute-0 python3.9[54636]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 06 13:33:07 compute-0 sudo[54634]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:08 compute-0 sudo[54809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lakrmmnnnooqhjaegjwmvqwphmqcqcrh ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757587.5756142-570-111672970878352/async_wrapper.py j109813998894 300 /home/zuul/.ansible/tmp/ansible-tmp-1759757587.5756142-570-111672970878352/AnsiballZ_edpm_os_net_config.py _'
Oct 06 13:33:08 compute-0 sudo[54809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:08 compute-0 ansible-async_wrapper.py[54811]: Invoked with j109813998894 300 /home/zuul/.ansible/tmp/ansible-tmp-1759757587.5756142-570-111672970878352/AnsiballZ_edpm_os_net_config.py _
Oct 06 13:33:08 compute-0 ansible-async_wrapper.py[54814]: Starting module and watcher
Oct 06 13:33:08 compute-0 ansible-async_wrapper.py[54814]: Start watching 54815 (300)
Oct 06 13:33:08 compute-0 ansible-async_wrapper.py[54815]: Start module (54815)
Oct 06 13:33:08 compute-0 ansible-async_wrapper.py[54811]: Return async_wrapper task started.
Oct 06 13:33:08 compute-0 sudo[54809]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:08 compute-0 python3.9[54816]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 06 13:33:09 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 06 13:33:09 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 06 13:33:09 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 06 13:33:09 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 06 13:33:09 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.1500] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.1522] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2107] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2108] audit: op="connection-add" uuid="963e301f-cccb-484b-8878-b62627d7bb22" name="br-ex-br" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2130] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2132] audit: op="connection-add" uuid="e0cee6b6-06b3-4bcc-8cb7-66ee8ecf260d" name="br-ex-port" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2146] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2147] audit: op="connection-add" uuid="b4a4dd93-ea53-45d6-9c89-fddef234daac" name="eth1-port" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2162] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2163] audit: op="connection-add" uuid="bc1bc22f-da12-4aef-924a-01c478d8f331" name="vlan20-port" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2176] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2177] audit: op="connection-add" uuid="fdfc63b3-6a8e-42fc-89e3-3316d97a5c2d" name="vlan21-port" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2189] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2191] audit: op="connection-add" uuid="cea5431c-e67f-4ffb-a99b-14ed6c5d8cc6" name="vlan22-port" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2214] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2233] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2234] audit: op="connection-add" uuid="d9656244-8d0a-464a-94c5-8bd29816d698" name="br-ex-if" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2277] audit: op="connection-update" uuid="ee89cfcf-f0a3-5a6c-9105-8e4b6101123c" name="ci-private-network" args="connection.controller,connection.master,connection.port-type,connection.slave-type,connection.timestamp,ovs-external-ids.data,ovs-interface.type,ipv6.addresses,ipv6.routes,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.never-default" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2295] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2296] audit: op="connection-add" uuid="b7806448-1ac9-4d7c-951f-7712b9a80317" name="vlan20-if" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2314] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2315] audit: op="connection-add" uuid="604e260d-12d6-4b64-ba8c-5e45b6ebb7e3" name="vlan21-if" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2341] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2344] audit: op="connection-add" uuid="6c945d4e-d055-4257-ad05-ab661eebc4ca" name="vlan22-if" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2363] audit: op="connection-delete" uuid="aa3ec181-3406-36ec-85f4-51c2c84686c7" name="Wired connection 1" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2376] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2385] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2388] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (963e301f-cccb-484b-8878-b62627d7bb22)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2388] audit: op="connection-activate" uuid="963e301f-cccb-484b-8878-b62627d7bb22" name="br-ex-br" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2390] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2396] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2399] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (e0cee6b6-06b3-4bcc-8cb7-66ee8ecf260d)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2400] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2405] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2408] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (b4a4dd93-ea53-45d6-9c89-fddef234daac)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2410] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2415] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2418] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (bc1bc22f-da12-4aef-924a-01c478d8f331)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2420] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2425] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2428] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fdfc63b3-6a8e-42fc-89e3-3316d97a5c2d)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2430] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2434] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2437] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (cea5431c-e67f-4ffb-a99b-14ed6c5d8cc6)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2438] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2440] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2441] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2447] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2450] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2454] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d9656244-8d0a-464a-94c5-8bd29816d698)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2454] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2456] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2458] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2458] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2460] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2469] device (eth1): disconnecting for new activation request.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2469] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2472] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2473] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2474] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2476] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2479] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2482] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (b7806448-1ac9-4d7c-951f-7712b9a80317)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2483] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2485] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2486] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2487] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2490] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2494] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2498] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (604e260d-12d6-4b64-ba8c-5e45b6ebb7e3)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2498] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2501] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2503] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2504] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2506] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2510] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2513] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (6c945d4e-d055-4257-ad05-ab661eebc4ca)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2514] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2516] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2518] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2519] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2520] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2533] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2535] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2538] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2539] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2551] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2556] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2560] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2563] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2564] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2568] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2571] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2575] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2576] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2582] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2586] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2598] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2600] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 systemd-udevd[54823]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:33:11 compute-0 kernel: Timeout policy base is empty
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2605] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2611] dhcp4 (eth0): canceled DHCP transaction
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2611] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2611] dhcp4 (eth0): state changed no lease
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2613] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2632] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2640] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54817 uid=0 result="fail" reason="Device is not activated"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2643] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2649] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2655] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2659] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Oct 06 13:33:11 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2746] device (eth1): disconnecting for new activation request.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2747] audit: op="connection-activate" uuid="ee89cfcf-f0a3-5a6c-9105-8e4b6101123c" name="ci-private-network" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2775] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54817 uid=0 result="success"
Oct 06 13:33:11 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.2879] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3033] device (eth1): Activation: starting connection 'ci-private-network' (ee89cfcf-f0a3-5a6c-9105-8e4b6101123c)
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3042] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3071] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3078] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 kernel: br-ex: entered promiscuous mode
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3109] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3112] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3115] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3115] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3116] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3117] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3118] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3124] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3128] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3130] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3133] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3135] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3137] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3140] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3142] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3145] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3147] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3149] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3155] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3158] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 06 13:33:11 compute-0 kernel: vlan22: entered promiscuous mode
Oct 06 13:33:11 compute-0 systemd-udevd[54822]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3257] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3264] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3272] device (eth1): Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3336] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 06 13:33:11 compute-0 kernel: vlan21: entered promiscuous mode
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3357] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3389] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3391] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3399] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 kernel: vlan20: entered promiscuous mode
Oct 06 13:33:11 compute-0 systemd-udevd[54821]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3471] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3485] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3509] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3528] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3536] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3540] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3547] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3580] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3581] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3587] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3629] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3653] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3711] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3713] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 06 13:33:11 compute-0 NetworkManager[52035]: <info>  [1759757591.3719] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 06 13:33:12 compute-0 sudo[55147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxdxvibwkfakonfkyuwqietrfgbahgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757591.5615425-570-210361735265294/AnsiballZ_async_status.py'
Oct 06 13:33:12 compute-0 sudo[55147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:12 compute-0 python3.9[55149]: ansible-ansible.legacy.async_status Invoked with jid=j109813998894.54811 mode=status _async_dir=/root/.ansible_async
Oct 06 13:33:12 compute-0 sudo[55147]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:12 compute-0 NetworkManager[52035]: <info>  [1759757592.4838] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54817 uid=0 result="success"
Oct 06 13:33:12 compute-0 NetworkManager[52035]: <info>  [1759757592.6791] checkpoint[0x557f1f2b7950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 06 13:33:12 compute-0 NetworkManager[52035]: <info>  [1759757592.6793] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.0655] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.0675] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.3432] audit: op="networking-control" arg="global-dns-configuration" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.3465] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.3503] audit: op="networking-control" arg="global-dns-configuration" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.4218] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 ansible-async_wrapper.py[54814]: 54815 still running (300)
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.5817] checkpoint[0x557f1f2b7a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 06 13:33:13 compute-0 NetworkManager[52035]: <info>  [1759757593.5822] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54817 uid=0 result="success"
Oct 06 13:33:13 compute-0 ansible-async_wrapper.py[54815]: Module complete (54815)
Oct 06 13:33:15 compute-0 sudo[55253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jylflrnxrgopiylenvrewwdeajocokvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757591.5615425-570-210361735265294/AnsiballZ_async_status.py'
Oct 06 13:33:15 compute-0 sudo[55253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:15 compute-0 python3.9[55255]: ansible-ansible.legacy.async_status Invoked with jid=j109813998894.54811 mode=status _async_dir=/root/.ansible_async
Oct 06 13:33:15 compute-0 sudo[55253]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:16 compute-0 sudo[55353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mximvjveecnlndbmsgmscrpfvmqurkaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757591.5615425-570-210361735265294/AnsiballZ_async_status.py'
Oct 06 13:33:16 compute-0 sudo[55353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:16 compute-0 python3.9[55355]: ansible-ansible.legacy.async_status Invoked with jid=j109813998894.54811 mode=cleanup _async_dir=/root/.ansible_async
Oct 06 13:33:16 compute-0 sudo[55353]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:16 compute-0 sudo[55505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aepinhrlydjmrhpibteivewofclkbosq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757596.6552212-619-188323784939605/AnsiballZ_stat.py'
Oct 06 13:33:16 compute-0 sudo[55505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:17 compute-0 python3.9[55507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:33:17 compute-0 sudo[55505]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:17 compute-0 sudo[55628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjsixqravwnjxxaigtetpwgkkzmdcaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757596.6552212-619-188323784939605/AnsiballZ_copy.py'
Oct 06 13:33:17 compute-0 sudo[55628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:17 compute-0 python3.9[55630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757596.6552212-619-188323784939605/.source.returncode _original_basename=.6kac2wb8 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:17 compute-0 sudo[55628]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:18 compute-0 sudo[55780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqesmcnhnpqlcgdcsdymycezesuuiqra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757597.9945025-651-203786993105662/AnsiballZ_stat.py'
Oct 06 13:33:18 compute-0 sudo[55780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:18 compute-0 ansible-async_wrapper.py[54814]: Done in kid B.
Oct 06 13:33:18 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 06 13:33:18 compute-0 python3.9[55782]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:33:18 compute-0 sudo[55780]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:18 compute-0 sudo[55905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bffiplrjslvznwowthgcsphtflgdcbji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757597.9945025-651-203786993105662/AnsiballZ_copy.py'
Oct 06 13:33:18 compute-0 sudo[55905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:19 compute-0 python3.9[55908]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757597.9945025-651-203786993105662/.source.cfg _original_basename=.os_1wp2h follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:19 compute-0 sudo[55905]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:19 compute-0 sudo[56058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxztgwhdmdxzpbwblbelcfmmnfiichee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757599.365056-681-260175777301591/AnsiballZ_systemd.py'
Oct 06 13:33:19 compute-0 sudo[56058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:19 compute-0 python3.9[56060]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:33:20 compute-0 systemd[1]: Reloading Network Manager...
Oct 06 13:33:20 compute-0 NetworkManager[52035]: <info>  [1759757600.1082] audit: op="reload" arg="0" pid=56064 uid=0 result="success"
Oct 06 13:33:20 compute-0 NetworkManager[52035]: <info>  [1759757600.1090] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 06 13:33:20 compute-0 systemd[1]: Reloaded Network Manager.
Oct 06 13:33:20 compute-0 sudo[56058]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:20 compute-0 sshd-session[48032]: Connection closed by 192.168.122.30 port 32910
Oct 06 13:33:20 compute-0 sshd-session[48029]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:33:20 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct 06 13:33:20 compute-0 systemd[1]: session-12.scope: Consumed 54.367s CPU time.
Oct 06 13:33:20 compute-0 systemd-logind[789]: Session 12 logged out. Waiting for processes to exit.
Oct 06 13:33:20 compute-0 systemd-logind[789]: Removed session 12.
Oct 06 13:33:25 compute-0 sshd-session[56096]: Accepted publickey for zuul from 192.168.122.30 port 51670 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:33:25 compute-0 systemd-logind[789]: New session 13 of user zuul.
Oct 06 13:33:25 compute-0 systemd[1]: Started Session 13 of User zuul.
Oct 06 13:33:25 compute-0 sshd-session[56096]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:33:27 compute-0 python3.9[56249]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:33:28 compute-0 python3.9[56403]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:33:29 compute-0 python3.9[56593]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:33:29 compute-0 sshd-session[56099]: Connection closed by 192.168.122.30 port 51670
Oct 06 13:33:29 compute-0 sshd-session[56096]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:33:29 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Oct 06 13:33:29 compute-0 systemd[1]: session-13.scope: Consumed 2.654s CPU time.
Oct 06 13:33:29 compute-0 systemd-logind[789]: Session 13 logged out. Waiting for processes to exit.
Oct 06 13:33:29 compute-0 systemd-logind[789]: Removed session 13.
Oct 06 13:33:30 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:33:36 compute-0 sshd-session[56622]: Accepted publickey for zuul from 192.168.122.30 port 45622 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:33:36 compute-0 systemd-logind[789]: New session 14 of user zuul.
Oct 06 13:33:36 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct 06 13:33:36 compute-0 sshd-session[56622]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:33:37 compute-0 python3.9[56775]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:33:38 compute-0 python3.9[56929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:33:39 compute-0 sudo[57084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywjhxrgwkhyqczqucwnfunzxartsgjio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757618.7309825-60-168304470450971/AnsiballZ_setup.py'
Oct 06 13:33:39 compute-0 sudo[57084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:39 compute-0 python3.9[57086]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:33:39 compute-0 sudo[57084]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:40 compute-0 sudo[57168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knynzvultcwgnzudavrwdwiiaqdtbufy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757618.7309825-60-168304470450971/AnsiballZ_dnf.py'
Oct 06 13:33:40 compute-0 sudo[57168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:40 compute-0 python3.9[57170]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:33:41 compute-0 sudo[57168]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:41 compute-0 sudo[57322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaugespgoxxsiuzshvmivjjpnjmjsgiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757621.6156995-84-177380976648382/AnsiballZ_setup.py'
Oct 06 13:33:41 compute-0 sudo[57322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:42 compute-0 python3.9[57324]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:33:43 compute-0 sudo[57322]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:44 compute-0 sudo[57513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrisalexgnrnzktqaylxvikhttprtjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757624.1012933-106-251481597635381/AnsiballZ_file.py'
Oct 06 13:33:44 compute-0 sudo[57513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:44 compute-0 python3.9[57515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:44 compute-0 sudo[57513]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:45 compute-0 sudo[57665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvhiizexvahwcvbpxjfvcwxrclghtci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757625.107802-122-54291994306211/AnsiballZ_command.py'
Oct 06 13:33:45 compute-0 sudo[57665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:45 compute-0 python3.9[57667]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:33:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:33:46 compute-0 sudo[57665]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:46 compute-0 sudo[57827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npymguxbsbdxxwbzmyannuzmlgugyrsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757626.2125356-138-226009103391972/AnsiballZ_stat.py'
Oct 06 13:33:46 compute-0 sudo[57827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:46 compute-0 python3.9[57829]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:33:46 compute-0 sudo[57827]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:47 compute-0 sudo[57905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnslatgxkffoocgpkhpxxrhciarxtcvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757626.2125356-138-226009103391972/AnsiballZ_file.py'
Oct 06 13:33:47 compute-0 sudo[57905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:47 compute-0 python3.9[57907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:33:47 compute-0 sudo[57905]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:48 compute-0 sudo[58057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahnurqtcybrlnhrkxdajkfqplrulyxaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757627.704029-162-269596474176228/AnsiballZ_stat.py'
Oct 06 13:33:48 compute-0 sudo[58057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:48 compute-0 python3.9[58059]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:33:48 compute-0 sudo[58057]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:48 compute-0 sudo[58135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xznkxmbrmmkretsjpmbrbrkethzvzfjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757627.704029-162-269596474176228/AnsiballZ_file.py'
Oct 06 13:33:48 compute-0 sudo[58135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:48 compute-0 python3.9[58137]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:33:48 compute-0 sudo[58135]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:49 compute-0 sudo[58287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxzzdwuezgacqlklwntdigomxafdnbai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757629.2879274-188-109635626062070/AnsiballZ_ini_file.py'
Oct 06 13:33:49 compute-0 sudo[58287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:50 compute-0 python3.9[58289]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:33:50 compute-0 sudo[58287]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:50 compute-0 sudo[58439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojkiwlvziuaszcjxeyauipdfbqyolowp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757630.2981107-188-242101410363363/AnsiballZ_ini_file.py'
Oct 06 13:33:50 compute-0 sudo[58439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:50 compute-0 python3.9[58441]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:33:50 compute-0 sudo[58439]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:51 compute-0 sudo[58591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crmwjpnxsvcruisuwsmgclvvjqfvfihq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757631.0120804-188-264854505492583/AnsiballZ_ini_file.py'
Oct 06 13:33:51 compute-0 sudo[58591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:51 compute-0 python3.9[58593]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:33:51 compute-0 sudo[58591]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:52 compute-0 sudo[58743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmwpxwayuckbsxuzwvbqaqolmwaqysqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757631.7891722-188-188461267615691/AnsiballZ_ini_file.py'
Oct 06 13:33:52 compute-0 sudo[58743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:52 compute-0 python3.9[58745]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:33:52 compute-0 sudo[58743]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:52 compute-0 sudo[58895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmxqeikxpdseqejbllmggkijrlskpyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757632.6288218-250-157541181193201/AnsiballZ_dnf.py'
Oct 06 13:33:52 compute-0 sudo[58895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:53 compute-0 python3.9[58897]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:33:54 compute-0 sudo[58895]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:55 compute-0 sudo[59048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkjuonwvdriiqxzciaoecisjvgvquomd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757634.8118358-272-79793328357766/AnsiballZ_setup.py'
Oct 06 13:33:55 compute-0 sudo[59048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:55 compute-0 python3.9[59050]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:33:55 compute-0 sudo[59048]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:56 compute-0 sudo[59202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxxlhvmhotblowbbiniyttfokgqrhxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757635.690609-288-131723288386034/AnsiballZ_stat.py'
Oct 06 13:33:56 compute-0 sudo[59202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:56 compute-0 python3.9[59204]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:33:56 compute-0 sudo[59202]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:57 compute-0 sudo[59354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvszazgjmxpeahnaxouxmcchseuralqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757636.7943933-306-216672612988719/AnsiballZ_stat.py'
Oct 06 13:33:57 compute-0 sudo[59354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:57 compute-0 python3.9[59356]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:33:57 compute-0 sudo[59354]: pam_unix(sudo:session): session closed for user root
Oct 06 13:33:58 compute-0 sudo[59506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rowvypvtuvrrjjuytgqppehavyxkzggp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757637.6200483-326-175714034823582/AnsiballZ_service_facts.py'
Oct 06 13:33:58 compute-0 sudo[59506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:33:58 compute-0 python3.9[59508]: ansible-service_facts Invoked
Oct 06 13:33:58 compute-0 network[59525]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:33:58 compute-0 network[59526]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:33:58 compute-0 network[59527]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:34:05 compute-0 sudo[59506]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:06 compute-0 sudo[59812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srsrlyqmueocrvlegwcvvpmgmqoasanb ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759757645.8053718-352-7829638361399/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759757645.8053718-352-7829638361399/args'
Oct 06 13:34:06 compute-0 sudo[59812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:06 compute-0 sudo[59812]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:06 compute-0 sudo[59979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szlbdlthjefcwvtnvfiyjrjyuzsypxba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757646.6012623-374-72033218736401/AnsiballZ_dnf.py'
Oct 06 13:34:06 compute-0 sudo[59979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:07 compute-0 python3.9[59981]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:34:08 compute-0 sudo[59979]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:09 compute-0 sudo[60132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjsinyxnoeqrqgitdxeyhnnrljvrnapz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757648.7020423-400-11713085805749/AnsiballZ_package_facts.py'
Oct 06 13:34:09 compute-0 sudo[60132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:09 compute-0 python3.9[60134]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 06 13:34:09 compute-0 sudo[60132]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:10 compute-0 sudo[60284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrxmndozhldzovkbyscdavujlxmwaobn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757650.3190885-420-239526018525086/AnsiballZ_stat.py'
Oct 06 13:34:10 compute-0 sudo[60284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:10 compute-0 python3.9[60286]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:10 compute-0 sudo[60284]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:11 compute-0 sudo[60409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qupeabalztlztmgemueoyzjzvqnbtnst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757650.3190885-420-239526018525086/AnsiballZ_copy.py'
Oct 06 13:34:11 compute-0 sudo[60409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:11 compute-0 python3.9[60411]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757650.3190885-420-239526018525086/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:11 compute-0 sudo[60409]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:12 compute-0 sudo[60563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xucmzkuqxygpojgudzyapnmudqbbibpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757651.8713915-450-86400009267007/AnsiballZ_stat.py'
Oct 06 13:34:12 compute-0 sudo[60563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:12 compute-0 python3.9[60565]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:12 compute-0 sudo[60563]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:12 compute-0 sudo[60688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctorxyustntuuwqteizvauoepvhxxchl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757651.8713915-450-86400009267007/AnsiballZ_copy.py'
Oct 06 13:34:12 compute-0 sudo[60688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:13 compute-0 python3.9[60690]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757651.8713915-450-86400009267007/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:13 compute-0 sudo[60688]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:14 compute-0 sudo[60842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmohklbnwzwelrkrxyivksltpbrhuqza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757653.7772353-492-265632540597214/AnsiballZ_lineinfile.py'
Oct 06 13:34:14 compute-0 sudo[60842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:14 compute-0 python3.9[60844]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:14 compute-0 sudo[60842]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:15 compute-0 sudo[60996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqsxdlsqtfyyppbjjgdnotdbodrqrizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757655.302149-522-120603686079178/AnsiballZ_setup.py'
Oct 06 13:34:15 compute-0 sudo[60996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:15 compute-0 python3.9[60998]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:34:16 compute-0 sudo[60996]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:16 compute-0 sudo[61080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffedhrpctvypvxlnpqlkxgrtktmlmbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757655.302149-522-120603686079178/AnsiballZ_systemd.py'
Oct 06 13:34:16 compute-0 sudo[61080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:17 compute-0 python3.9[61082]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:17 compute-0 sudo[61080]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:18 compute-0 sudo[61234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erneggmrlpexgwkzyzmybnqootncvvud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757657.7858846-554-125799765143419/AnsiballZ_setup.py'
Oct 06 13:34:18 compute-0 sudo[61234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:18 compute-0 python3.9[61236]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:34:18 compute-0 sudo[61234]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:18 compute-0 sudo[61318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzfomybxwqkcluomtavdynnnebxtmaal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757657.7858846-554-125799765143419/AnsiballZ_systemd.py'
Oct 06 13:34:18 compute-0 sudo[61318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:19 compute-0 python3.9[61320]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:34:19 compute-0 chronyd[787]: chronyd exiting
Oct 06 13:34:19 compute-0 systemd[1]: Stopping NTP client/server...
Oct 06 13:34:19 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Oct 06 13:34:19 compute-0 systemd[1]: Stopped NTP client/server.
Oct 06 13:34:19 compute-0 systemd[1]: Starting NTP client/server...
Oct 06 13:34:19 compute-0 chronyd[61328]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 06 13:34:19 compute-0 chronyd[61328]: Frequency -26.715 +/- 0.508 ppm read from /var/lib/chrony/drift
Oct 06 13:34:19 compute-0 chronyd[61328]: Loaded seccomp filter (level 2)
Oct 06 13:34:19 compute-0 systemd[1]: Started NTP client/server.
Oct 06 13:34:19 compute-0 sudo[61318]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:20 compute-0 sshd-session[56625]: Connection closed by 192.168.122.30 port 45622
Oct 06 13:34:20 compute-0 sshd-session[56622]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:34:20 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct 06 13:34:20 compute-0 systemd[1]: session-14.scope: Consumed 28.791s CPU time.
Oct 06 13:34:20 compute-0 systemd-logind[789]: Session 14 logged out. Waiting for processes to exit.
Oct 06 13:34:20 compute-0 systemd-logind[789]: Removed session 14.
Oct 06 13:34:25 compute-0 sshd-session[61354]: Accepted publickey for zuul from 192.168.122.30 port 60504 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:34:25 compute-0 systemd-logind[789]: New session 15 of user zuul.
Oct 06 13:34:25 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct 06 13:34:25 compute-0 sshd-session[61354]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:34:26 compute-0 python3.9[61507]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:34:27 compute-0 sudo[61661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbsqrnirpyfymbzmuwaoectorlhyensx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757667.0672925-46-1823739999174/AnsiballZ_file.py'
Oct 06 13:34:27 compute-0 sudo[61661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:27 compute-0 python3.9[61663]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:27 compute-0 sudo[61661]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:28 compute-0 sudo[61836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jczmykkwoqjqvfzgaygagvpereonkvqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757667.9266748-62-76173306013774/AnsiballZ_stat.py'
Oct 06 13:34:28 compute-0 sudo[61836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:28 compute-0 python3.9[61838]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:28 compute-0 sudo[61836]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:28 compute-0 sudo[61914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpwjmwelududyazlsfqwarkmwgeueinm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757667.9266748-62-76173306013774/AnsiballZ_file.py'
Oct 06 13:34:28 compute-0 sudo[61914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:29 compute-0 python3.9[61916]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.4naqgrmq recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:29 compute-0 sudo[61914]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:29 compute-0 sudo[62066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gidhribiqwmqwgpawtprwhysteimakvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757669.545666-102-81264170735442/AnsiballZ_stat.py'
Oct 06 13:34:29 compute-0 sudo[62066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:30 compute-0 python3.9[62068]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:30 compute-0 sudo[62066]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:30 compute-0 sudo[62189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpocyfuvivknhlmulsiqjaklhihzryqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757669.545666-102-81264170735442/AnsiballZ_copy.py'
Oct 06 13:34:30 compute-0 sudo[62189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:30 compute-0 python3.9[62191]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757669.545666-102-81264170735442/.source _original_basename=.6r81ck9i follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:30 compute-0 sudo[62189]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:31 compute-0 sudo[62341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssefrbwjbetweearhhqvqddjnztvcmok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757670.8655446-134-36292961126152/AnsiballZ_file.py'
Oct 06 13:34:31 compute-0 sudo[62341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:31 compute-0 python3.9[62343]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:34:31 compute-0 sudo[62341]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:31 compute-0 sudo[62493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppmthflbqlasrhwncbfjcxrdaqwolfcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757671.6150334-150-25127658122742/AnsiballZ_stat.py'
Oct 06 13:34:31 compute-0 sudo[62493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:32 compute-0 python3.9[62495]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:32 compute-0 sudo[62493]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:32 compute-0 sudo[62616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdemlqbnmowswqnynxsoimuqannhqyds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757671.6150334-150-25127658122742/AnsiballZ_copy.py'
Oct 06 13:34:32 compute-0 sudo[62616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:32 compute-0 python3.9[62618]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757671.6150334-150-25127658122742/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:34:32 compute-0 sudo[62616]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:33 compute-0 sudo[62768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzpletjwmkrqwwynszkrxgpbznmqddl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757672.9702957-150-14531145555601/AnsiballZ_stat.py'
Oct 06 13:34:33 compute-0 sudo[62768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:33 compute-0 python3.9[62770]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:33 compute-0 sudo[62768]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:33 compute-0 sudo[62891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgboymqkyylwbbrschyzgmrkpdbjkhad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757672.9702957-150-14531145555601/AnsiballZ_copy.py'
Oct 06 13:34:33 compute-0 sudo[62891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:34 compute-0 python3.9[62893]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757672.9702957-150-14531145555601/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:34:34 compute-0 sudo[62891]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:34 compute-0 sudo[63043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaugvazhvjzteiqnwybbjccbeutsrbyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757674.2399-208-261162969904524/AnsiballZ_file.py'
Oct 06 13:34:34 compute-0 sudo[63043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:34 compute-0 python3.9[63045]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:34 compute-0 sudo[63043]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:35 compute-0 sudo[63195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezgaspbybptqsrxxatagqtcruyjubdyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757675.1025004-224-152412290547212/AnsiballZ_stat.py'
Oct 06 13:34:35 compute-0 sudo[63195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:35 compute-0 python3.9[63197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:35 compute-0 sudo[63195]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:36 compute-0 sudo[63318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyrokborlcvsqvpkvsvsuxlqhassgaye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757675.1025004-224-152412290547212/AnsiballZ_copy.py'
Oct 06 13:34:36 compute-0 sudo[63318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:36 compute-0 python3.9[63320]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757675.1025004-224-152412290547212/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:36 compute-0 sudo[63318]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:36 compute-0 sudo[63470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkqlyrsswqcbbpindapwnjjxojlcfubu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757676.482685-254-117114003243468/AnsiballZ_stat.py'
Oct 06 13:34:36 compute-0 sudo[63470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:37 compute-0 python3.9[63472]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:37 compute-0 sudo[63470]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:37 compute-0 sudo[63593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvzebronyamujupjbylestopllqbkxvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757676.482685-254-117114003243468/AnsiballZ_copy.py'
Oct 06 13:34:37 compute-0 sudo[63593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:37 compute-0 python3.9[63595]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757676.482685-254-117114003243468/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:37 compute-0 sudo[63593]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:38 compute-0 sudo[63745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoprxrgawhyfmsmhgddmkjsbrpmzrcpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757677.865989-284-155184801953164/AnsiballZ_systemd.py'
Oct 06 13:34:38 compute-0 sudo[63745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:38 compute-0 python3.9[63747]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:38 compute-0 systemd[1]: Reloading.
Oct 06 13:34:38 compute-0 systemd-rc-local-generator[63775]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:38 compute-0 systemd-sysv-generator[63779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:39 compute-0 systemd[1]: Reloading.
Oct 06 13:34:39 compute-0 systemd-rc-local-generator[63811]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:39 compute-0 systemd-sysv-generator[63814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:39 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct 06 13:34:39 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct 06 13:34:39 compute-0 sudo[63745]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:39 compute-0 sudo[63971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsgmwnpdpgxschspsxlmpqbgvkfdnyhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757679.5868554-300-272277355917507/AnsiballZ_stat.py'
Oct 06 13:34:39 compute-0 sudo[63971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:40 compute-0 python3.9[63973]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:40 compute-0 sudo[63971]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:40 compute-0 sudo[64094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubqrbyuftigoztlhweyxvpenlpxdbmjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757679.5868554-300-272277355917507/AnsiballZ_copy.py'
Oct 06 13:34:40 compute-0 sudo[64094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:40 compute-0 python3.9[64096]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757679.5868554-300-272277355917507/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:40 compute-0 sudo[64094]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:41 compute-0 sudo[64246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmhelgpthmwinsfeublmitydeqspbjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757680.9760413-330-86984984130990/AnsiballZ_stat.py'
Oct 06 13:34:41 compute-0 sudo[64246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:41 compute-0 python3.9[64248]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:41 compute-0 sudo[64246]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:41 compute-0 sudo[64369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvrodwzleuxculpecnbdmxzyrmvnyrwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757680.9760413-330-86984984130990/AnsiballZ_copy.py'
Oct 06 13:34:41 compute-0 sudo[64369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:42 compute-0 python3.9[64371]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757680.9760413-330-86984984130990/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:42 compute-0 sudo[64369]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:42 compute-0 sudo[64521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upjrkkvjnawqgvjagcdbsicmordniitk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757682.3446205-360-117111936439027/AnsiballZ_systemd.py'
Oct 06 13:34:42 compute-0 sudo[64521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:42 compute-0 python3.9[64523]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:42 compute-0 systemd[1]: Reloading.
Oct 06 13:34:43 compute-0 systemd-rc-local-generator[64552]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:43 compute-0 systemd-sysv-generator[64556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:43 compute-0 systemd[1]: Reloading.
Oct 06 13:34:43 compute-0 systemd-sysv-generator[64594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:43 compute-0 systemd-rc-local-generator[64587]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:43 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:34:43 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:34:43 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:34:43 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:34:43 compute-0 sudo[64521]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:44 compute-0 python3.9[64751]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:34:44 compute-0 network[64768]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:34:44 compute-0 network[64769]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:34:44 compute-0 network[64770]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:34:48 compute-0 sudo[65032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibjlcndtimxaewoaebnpboacroetlmaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757687.9923084-392-121137422232201/AnsiballZ_systemd.py'
Oct 06 13:34:48 compute-0 sudo[65032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:48 compute-0 python3.9[65034]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:48 compute-0 systemd[1]: Reloading.
Oct 06 13:34:48 compute-0 systemd-rc-local-generator[65062]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:48 compute-0 systemd-sysv-generator[65069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:49 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 06 13:34:49 compute-0 iptables.init[65075]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 06 13:34:49 compute-0 iptables.init[65075]: iptables: Flushing firewall rules: [  OK  ]
Oct 06 13:34:49 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Oct 06 13:34:49 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 06 13:34:49 compute-0 sudo[65032]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:49 compute-0 sudo[65270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzjztfluibpdkqmrqmjqyrwfpjfvqhuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757689.5569358-392-140990051114860/AnsiballZ_systemd.py'
Oct 06 13:34:49 compute-0 sudo[65270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:50 compute-0 python3.9[65272]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:50 compute-0 sudo[65270]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:50 compute-0 sudo[65424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvomjlayebeotmzchdalxnjbvmdivoxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757690.5467868-424-242386710527197/AnsiballZ_systemd.py'
Oct 06 13:34:50 compute-0 sudo[65424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:55 compute-0 python3.9[65426]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:34:55 compute-0 systemd[1]: Reloading.
Oct 06 13:34:55 compute-0 systemd-rc-local-generator[65452]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:34:55 compute-0 systemd-sysv-generator[65456]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:34:55 compute-0 systemd[1]: Starting Netfilter Tables...
Oct 06 13:34:55 compute-0 systemd[1]: Finished Netfilter Tables.
Oct 06 13:34:55 compute-0 sudo[65424]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:56 compute-0 sudo[65616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nutwqjfpyobentivbjshvvuljdyratqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757696.1238072-440-90028455786557/AnsiballZ_command.py'
Oct 06 13:34:56 compute-0 sudo[65616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:56 compute-0 python3.9[65618]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:34:56 compute-0 sudo[65616]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:57 compute-0 sudo[65769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffvukvglkzbdccvxkjgmhttrbqgkqqfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757697.5503237-468-248099875085557/AnsiballZ_stat.py'
Oct 06 13:34:57 compute-0 sudo[65769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:58 compute-0 python3.9[65771]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:34:58 compute-0 sudo[65769]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:58 compute-0 sudo[65894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poonnbtwkicjlpbdgekgeaawuhcwkjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757697.5503237-468-248099875085557/AnsiballZ_copy.py'
Oct 06 13:34:58 compute-0 sudo[65894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:34:58 compute-0 python3.9[65896]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757697.5503237-468-248099875085557/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:34:58 compute-0 sudo[65894]: pam_unix(sudo:session): session closed for user root
Oct 06 13:34:59 compute-0 python3.9[66047]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:34:59 compute-0 polkitd[6342]: Registered Authentication Agent for unix-process:66049:216156 (system bus name :1.555 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 06 13:35:24 compute-0 polkitd[6342]: Unregistered Authentication Agent for unix-process:66049:216156 (system bus name :1.555, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 06 13:35:24 compute-0 polkit-agent-helper-1[66061]: pam_unix(polkit-1:auth): conversation failed
Oct 06 13:35:24 compute-0 polkit-agent-helper-1[66061]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 06 13:35:24 compute-0 polkitd[6342]: Operator of unix-process:66049:216156 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.554 [<unknown>] (owned by unix-user:zuul)
Oct 06 13:35:25 compute-0 sshd-session[61357]: Connection closed by 192.168.122.30 port 60504
Oct 06 13:35:25 compute-0 sshd-session[61354]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:35:25 compute-0 systemd-logind[789]: Session 15 logged out. Waiting for processes to exit.
Oct 06 13:35:25 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct 06 13:35:25 compute-0 systemd[1]: session-15.scope: Consumed 22.616s CPU time.
Oct 06 13:35:25 compute-0 systemd-logind[789]: Removed session 15.
Oct 06 13:35:26 compute-0 sshd-session[66087]: Invalid user banxgg from 45.148.10.240 port 44064
Oct 06 13:35:26 compute-0 sshd-session[66087]: Connection closed by invalid user banxgg 45.148.10.240 port 44064 [preauth]
Oct 06 13:35:38 compute-0 sshd-session[66089]: Accepted publickey for zuul from 192.168.122.30 port 45754 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:35:38 compute-0 systemd-logind[789]: New session 16 of user zuul.
Oct 06 13:35:38 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct 06 13:35:38 compute-0 sshd-session[66089]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:35:39 compute-0 python3.9[66242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:35:40 compute-0 sudo[66396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnclmgnqyoizuwvtghgzhqoprdrltip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757739.8397963-46-241283080047084/AnsiballZ_file.py'
Oct 06 13:35:40 compute-0 sudo[66396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:40 compute-0 python3.9[66398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:40 compute-0 sudo[66396]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:41 compute-0 sudo[66571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klbgjpmapnafdlxanzgpaecdhmrekxvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757740.818379-62-235568510175586/AnsiballZ_stat.py'
Oct 06 13:35:41 compute-0 sudo[66571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:41 compute-0 python3.9[66573]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:41 compute-0 sudo[66571]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:41 compute-0 sudo[66649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-civelcokugvlmpxosuggmivszbwwdizh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757740.818379-62-235568510175586/AnsiballZ_file.py'
Oct 06 13:35:41 compute-0 sudo[66649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:42 compute-0 python3.9[66651]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.h5_flp35 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:42 compute-0 sudo[66649]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:43 compute-0 sudo[66801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahbkmiegiqepskynrfojhalvosrvrloh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757742.650756-102-48845663303841/AnsiballZ_stat.py'
Oct 06 13:35:43 compute-0 sudo[66801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:43 compute-0 python3.9[66803]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:43 compute-0 sudo[66801]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:43 compute-0 sudo[66879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfzgzgxwlahbreoaqjoodiebobknbkjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757742.650756-102-48845663303841/AnsiballZ_file.py'
Oct 06 13:35:43 compute-0 sudo[66879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:43 compute-0 python3.9[66881]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.mfaoiyjt recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:43 compute-0 sudo[66879]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:44 compute-0 sudo[67031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwdtzwhvxxigvwebvyeyzeqwffswpccw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757744.0353534-128-18086568739978/AnsiballZ_file.py'
Oct 06 13:35:44 compute-0 sudo[67031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:44 compute-0 python3.9[67033]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:35:44 compute-0 sudo[67031]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:45 compute-0 sudo[67183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpieiyntblwfxkothsgvgtstagewbdmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757744.8302307-144-31971292019481/AnsiballZ_stat.py'
Oct 06 13:35:45 compute-0 sudo[67183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:45 compute-0 python3.9[67185]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:45 compute-0 sudo[67183]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:45 compute-0 sudo[67261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmamecddpiqevvadgqovidrahadsvbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757744.8302307-144-31971292019481/AnsiballZ_file.py'
Oct 06 13:35:45 compute-0 sudo[67261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:45 compute-0 python3.9[67263]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:35:45 compute-0 sudo[67261]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:46 compute-0 sudo[67413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxejxhfoayqdlspggztigcaqxaxvdjbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757746.0707457-144-101853196944545/AnsiballZ_stat.py'
Oct 06 13:35:46 compute-0 sudo[67413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:46 compute-0 python3.9[67415]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:46 compute-0 sudo[67413]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:46 compute-0 sudo[67491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blyfbkidwgjigfgssmtoohrtpjecbgue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757746.0707457-144-101853196944545/AnsiballZ_file.py'
Oct 06 13:35:46 compute-0 sudo[67491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:47 compute-0 python3.9[67493]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:35:47 compute-0 sudo[67491]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:47 compute-0 sudo[67643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exnflbyzqybecnqhwigcuovsimftaupx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757747.3716488-190-223947340676204/AnsiballZ_file.py'
Oct 06 13:35:47 compute-0 sudo[67643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:47 compute-0 python3.9[67645]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:47 compute-0 sudo[67643]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:48 compute-0 sudo[67795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cebwxbhuoqxoqplayqgpualwnpvqdqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757748.183094-206-79187923065954/AnsiballZ_stat.py'
Oct 06 13:35:48 compute-0 sudo[67795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:48 compute-0 python3.9[67797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:48 compute-0 sudo[67795]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:49 compute-0 sudo[67873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugeruwutmpmgzsgojxcnyqtzfliqxlko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757748.183094-206-79187923065954/AnsiballZ_file.py'
Oct 06 13:35:49 compute-0 sudo[67873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:49 compute-0 python3.9[67875]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:49 compute-0 sudo[67873]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:49 compute-0 sudo[68025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfeilngrkvjcmzhuptpqciejjrbgnatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757749.5366004-230-147289555035204/AnsiballZ_stat.py'
Oct 06 13:35:49 compute-0 sudo[68025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:50 compute-0 python3.9[68027]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:50 compute-0 sudo[68025]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:50 compute-0 sudo[68103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdaoaxuunnkxucipzhhvpwnsgqvzeiqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757749.5366004-230-147289555035204/AnsiballZ_file.py'
Oct 06 13:35:50 compute-0 sudo[68103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:50 compute-0 python3.9[68105]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:50 compute-0 sudo[68103]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:51 compute-0 sudo[68255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndwwvkzlkrxinppqrqeavjxtwijltpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757750.768096-254-58682415406787/AnsiballZ_systemd.py'
Oct 06 13:35:51 compute-0 sudo[68255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:51 compute-0 python3.9[68257]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:35:51 compute-0 systemd[1]: Reloading.
Oct 06 13:35:51 compute-0 systemd-sysv-generator[68286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:35:51 compute-0 systemd-rc-local-generator[68279]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:35:52 compute-0 sudo[68255]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:53 compute-0 sudo[68443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctdsxbpzxbqnasrcartupjauoudkwudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757753.1011949-270-249919815623489/AnsiballZ_stat.py'
Oct 06 13:35:53 compute-0 sudo[68443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:53 compute-0 python3.9[68445]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:53 compute-0 sudo[68443]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:53 compute-0 sudo[68521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfnpymklgjinorgwlrrxsfpvbsnzycbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757753.1011949-270-249919815623489/AnsiballZ_file.py'
Oct 06 13:35:53 compute-0 sudo[68521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:54 compute-0 python3.9[68523]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:54 compute-0 sudo[68521]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:54 compute-0 sudo[68673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbxqslzttmjdsrgxgdlztkbolxjvpcjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757754.459872-294-229063154258620/AnsiballZ_stat.py'
Oct 06 13:35:54 compute-0 sudo[68673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:55 compute-0 python3.9[68675]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:35:55 compute-0 sudo[68673]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:55 compute-0 sudo[68751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bveafyemljxrwthbjqigpagbralkhzcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757754.459872-294-229063154258620/AnsiballZ_file.py'
Oct 06 13:35:55 compute-0 sudo[68751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:55 compute-0 python3.9[68753]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:35:55 compute-0 sudo[68751]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:56 compute-0 sudo[68903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gohjhcrdectuoznmacqxuisbwynekcle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757755.8114812-318-251006368705522/AnsiballZ_systemd.py'
Oct 06 13:35:56 compute-0 sudo[68903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:35:56 compute-0 python3.9[68905]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:35:56 compute-0 systemd[1]: Reloading.
Oct 06 13:35:56 compute-0 systemd-rc-local-generator[68931]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:35:56 compute-0 systemd-sysv-generator[68935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:35:56 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:35:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:35:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:35:56 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:35:56 compute-0 sudo[68903]: pam_unix(sudo:session): session closed for user root
Oct 06 13:35:57 compute-0 python3.9[69095]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:35:57 compute-0 network[69112]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:35:57 compute-0 network[69113]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:35:57 compute-0 network[69114]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:36:03 compute-0 sudo[69375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oapahvchusfahvgtcjnsypbifonetjfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757762.9554613-370-48221841951053/AnsiballZ_stat.py'
Oct 06 13:36:03 compute-0 sudo[69375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:03 compute-0 python3.9[69377]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:03 compute-0 sudo[69375]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:03 compute-0 sudo[69453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsrblynncqtqdjpyvsyuqjmjjsxxgyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757762.9554613-370-48221841951053/AnsiballZ_file.py'
Oct 06 13:36:03 compute-0 sudo[69453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:04 compute-0 python3.9[69455]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:04 compute-0 sudo[69453]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:04 compute-0 sudo[69605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnlyekxdecsjoyqvjidvxshpazodoxil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757764.5064182-396-264857874329922/AnsiballZ_file.py'
Oct 06 13:36:04 compute-0 sudo[69605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:05 compute-0 python3.9[69607]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:05 compute-0 sudo[69605]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:05 compute-0 sudo[69757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weefronunoiopdzxchkuahhkpuawabuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757765.2943015-412-68756993631232/AnsiballZ_stat.py'
Oct 06 13:36:05 compute-0 sudo[69757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:05 compute-0 python3.9[69759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:05 compute-0 sudo[69757]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:06 compute-0 sudo[69880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuetuudzoiuripmigjxqgxcgnozaylol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757765.2943015-412-68756993631232/AnsiballZ_copy.py'
Oct 06 13:36:06 compute-0 sudo[69880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:06 compute-0 python3.9[69882]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757765.2943015-412-68756993631232/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:06 compute-0 sudo[69880]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:07 compute-0 sudo[70032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awwflzxseicdpfgzecmgpgzdzfhucymj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757766.9261734-448-164626312756493/AnsiballZ_timezone.py'
Oct 06 13:36:07 compute-0 sudo[70032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:07 compute-0 python3.9[70034]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 06 13:36:07 compute-0 systemd[1]: Starting Time & Date Service...
Oct 06 13:36:07 compute-0 systemd[1]: Started Time & Date Service.
Oct 06 13:36:07 compute-0 sudo[70032]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:08 compute-0 sudo[70188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzcfnxeoytqrecfqenrcihvjrzotcetx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757768.1607811-466-255294482578672/AnsiballZ_file.py'
Oct 06 13:36:08 compute-0 sudo[70188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:08 compute-0 python3.9[70190]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:08 compute-0 sudo[70188]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:09 compute-0 sudo[70340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdskjslvoyriwcwafsubhmazjrzpnzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757768.9404173-482-214158046612425/AnsiballZ_stat.py'
Oct 06 13:36:09 compute-0 sudo[70340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:09 compute-0 python3.9[70342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:09 compute-0 sudo[70340]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:09 compute-0 sudo[70463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvyycgqukkawulqyrsprngyvidalpicg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757768.9404173-482-214158046612425/AnsiballZ_copy.py'
Oct 06 13:36:09 compute-0 sudo[70463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:10 compute-0 python3.9[70465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757768.9404173-482-214158046612425/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:10 compute-0 sudo[70463]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:10 compute-0 sudo[70615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiybsllzlojzbdgjtgvwtebrmvwbrabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757770.4517903-512-56574052622554/AnsiballZ_stat.py'
Oct 06 13:36:10 compute-0 sudo[70615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:11 compute-0 python3.9[70617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:11 compute-0 sudo[70615]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:11 compute-0 sudo[70738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkasdiwaagiwtcukugwermizgzbrlreu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757770.4517903-512-56574052622554/AnsiballZ_copy.py'
Oct 06 13:36:11 compute-0 sudo[70738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:11 compute-0 python3.9[70740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757770.4517903-512-56574052622554/.source.yaml _original_basename=.xavq6yql follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:11 compute-0 sudo[70738]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:12 compute-0 sudo[70890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-katifnzxnexfssodlbhywynwwmoiiydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757771.950653-542-262909222902481/AnsiballZ_stat.py'
Oct 06 13:36:12 compute-0 sudo[70890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:12 compute-0 python3.9[70892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:12 compute-0 sudo[70890]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:12 compute-0 sudo[71013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgbvheyeeexuzxetsbypnvqtydsaapzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757771.950653-542-262909222902481/AnsiballZ_copy.py'
Oct 06 13:36:12 compute-0 sudo[71013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:13 compute-0 python3.9[71015]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757771.950653-542-262909222902481/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:13 compute-0 sudo[71013]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:13 compute-0 sudo[71165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoxsrxezninppblpqhevucalysxyvqqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757773.297921-572-37813684449656/AnsiballZ_command.py'
Oct 06 13:36:13 compute-0 sudo[71165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:14 compute-0 python3.9[71167]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:14 compute-0 sudo[71165]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:14 compute-0 sudo[71318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppqyblsnvsvmrstvcxwcieknwkhaqqrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757774.2472389-588-82022863758132/AnsiballZ_command.py'
Oct 06 13:36:14 compute-0 sudo[71318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:14 compute-0 python3.9[71320]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:14 compute-0 sudo[71318]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:15 compute-0 sudo[71471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqehcmcgrzvhvqcihwmcnolaqxwbsel ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759757774.9851322-604-71921802354497/AnsiballZ_edpm_nftables_from_files.py'
Oct 06 13:36:15 compute-0 sudo[71471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:15 compute-0 python3[71473]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 06 13:36:15 compute-0 sudo[71471]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:16 compute-0 sudo[71623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upuakrhqoxvbidfqohfzfiivgwnnpfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757775.8900132-620-7683418154418/AnsiballZ_stat.py'
Oct 06 13:36:16 compute-0 sudo[71623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:16 compute-0 python3.9[71625]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:16 compute-0 sudo[71623]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:16 compute-0 sudo[71746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scndljfanivuboflncuqfoegcqscjyso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757775.8900132-620-7683418154418/AnsiballZ_copy.py'
Oct 06 13:36:16 compute-0 sudo[71746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:17 compute-0 python3.9[71748]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757775.8900132-620-7683418154418/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:17 compute-0 sudo[71746]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:17 compute-0 sudo[71898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnxmhqxssgpxscfszyjkxcybeumgqegn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757777.2219515-650-170656426834418/AnsiballZ_stat.py'
Oct 06 13:36:17 compute-0 sudo[71898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:17 compute-0 python3.9[71900]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:17 compute-0 sudo[71898]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:18 compute-0 sudo[72021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xikrqxkddhhzcodztohruojawttjohdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757777.2219515-650-170656426834418/AnsiballZ_copy.py'
Oct 06 13:36:18 compute-0 sudo[72021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:18 compute-0 python3.9[72023]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757777.2219515-650-170656426834418/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:18 compute-0 sudo[72021]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:19 compute-0 sudo[72173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceobwzpfubyuqdebstjmevhmqnucappn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757778.656475-680-100084865375581/AnsiballZ_stat.py'
Oct 06 13:36:19 compute-0 sudo[72173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:19 compute-0 python3.9[72175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:19 compute-0 sudo[72173]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:19 compute-0 sudo[72296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yacgbyvlvdotlrybwgiqjxiyaxlmjyxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757778.656475-680-100084865375581/AnsiballZ_copy.py'
Oct 06 13:36:19 compute-0 sudo[72296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:19 compute-0 python3.9[72298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757778.656475-680-100084865375581/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:19 compute-0 sudo[72296]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:20 compute-0 sudo[72448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsvusdhhibgtcfwfxekpizvzpwlpejfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757780.1142864-710-191673473290948/AnsiballZ_stat.py'
Oct 06 13:36:20 compute-0 sudo[72448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:20 compute-0 python3.9[72450]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:20 compute-0 sudo[72448]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:21 compute-0 sudo[72571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvjsirpohxlvxrksscbtdpqoodykmhqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757780.1142864-710-191673473290948/AnsiballZ_copy.py'
Oct 06 13:36:21 compute-0 sudo[72571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:21 compute-0 python3.9[72573]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757780.1142864-710-191673473290948/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:21 compute-0 sudo[72571]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:21 compute-0 sudo[72723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojdrkihivugxscvysbbubmvxcnjdayxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757781.5421689-740-272640109269168/AnsiballZ_stat.py'
Oct 06 13:36:21 compute-0 sudo[72723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:22 compute-0 python3.9[72725]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:36:22 compute-0 sudo[72723]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:22 compute-0 sudo[72846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itqnbyhhxmlppswpxcbvsbvrolhkcjss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757781.5421689-740-272640109269168/AnsiballZ_copy.py'
Oct 06 13:36:22 compute-0 sudo[72846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:22 compute-0 python3.9[72848]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757781.5421689-740-272640109269168/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:22 compute-0 sudo[72846]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:23 compute-0 sudo[72998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huwhdbfhmyescufdovwjvyqustdaakmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757783.0429707-770-83801761829564/AnsiballZ_file.py'
Oct 06 13:36:23 compute-0 sudo[72998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:23 compute-0 python3.9[73000]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:23 compute-0 sudo[72998]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:24 compute-0 sudo[73150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgeorxdlkyjbetbgqqjhzspiwxtelkmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757783.8124187-786-147258642996029/AnsiballZ_command.py'
Oct 06 13:36:24 compute-0 sudo[73150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:24 compute-0 python3.9[73152]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:24 compute-0 sudo[73150]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:25 compute-0 sudo[73309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klfcfwbefwiuwgqmwtpatgglqbgqymlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757784.720821-802-271831546060511/AnsiballZ_blockinfile.py'
Oct 06 13:36:25 compute-0 sudo[73309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:25 compute-0 python3.9[73311]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:25 compute-0 sudo[73309]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:26 compute-0 sudo[73462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owoqyzuscbuimcioevdmyupalkkfpukm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757785.7415178-820-165300988788725/AnsiballZ_file.py'
Oct 06 13:36:26 compute-0 sudo[73462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:26 compute-0 python3.9[73464]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:26 compute-0 sudo[73462]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:26 compute-0 sudo[73614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftmhixzksacpyigaopycfxfcvmyaewtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757786.4068315-820-38727627696823/AnsiballZ_file.py'
Oct 06 13:36:26 compute-0 sudo[73614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:26 compute-0 python3.9[73616]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:27 compute-0 sudo[73614]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:27 compute-0 sudo[73766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aftcluzvjbodjtvvbbkivoytrudpmbcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757787.2040021-850-120292262634483/AnsiballZ_mount.py'
Oct 06 13:36:27 compute-0 sudo[73766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:27 compute-0 python3.9[73768]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 06 13:36:27 compute-0 sudo[73766]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:27 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 13:36:28 compute-0 sudo[73920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssvhvitvzldwlzlmocqfqcjwirodjtdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757788.0543623-850-22655634753690/AnsiballZ_mount.py'
Oct 06 13:36:28 compute-0 sudo[73920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:28 compute-0 python3.9[73922]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 06 13:36:28 compute-0 sudo[73920]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:28 compute-0 sshd-session[66092]: Connection closed by 192.168.122.30 port 45754
Oct 06 13:36:28 compute-0 sshd-session[66089]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:36:28 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Oct 06 13:36:28 compute-0 systemd[1]: session-16.scope: Consumed 38.289s CPU time.
Oct 06 13:36:28 compute-0 systemd-logind[789]: Session 16 logged out. Waiting for processes to exit.
Oct 06 13:36:28 compute-0 systemd-logind[789]: Removed session 16.
Oct 06 13:36:29 compute-0 chronyd[61328]: Selected source 23.133.168.246 (pool.ntp.org)
Oct 06 13:36:34 compute-0 sshd-session[73948]: Accepted publickey for zuul from 192.168.122.30 port 60922 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:36:34 compute-0 systemd-logind[789]: New session 17 of user zuul.
Oct 06 13:36:34 compute-0 systemd[1]: Started Session 17 of User zuul.
Oct 06 13:36:34 compute-0 sshd-session[73948]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:36:34 compute-0 sudo[74101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obrvbhysbnhrzvkennjtzaezyzkrehlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757794.335868-17-40461297002129/AnsiballZ_tempfile.py'
Oct 06 13:36:34 compute-0 sudo[74101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:35 compute-0 python3.9[74103]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 06 13:36:35 compute-0 sudo[74101]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:35 compute-0 sudo[74253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sioktngugsbaetgjbacywcoairwggkvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757795.3479807-41-253671864326294/AnsiballZ_stat.py'
Oct 06 13:36:35 compute-0 sudo[74253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:36 compute-0 python3.9[74255]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:36:36 compute-0 sudo[74253]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:37 compute-0 sudo[74405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjxclfogvuwnrveasofmtydzdylbxlzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757796.3938963-61-129611544694289/AnsiballZ_setup.py'
Oct 06 13:36:37 compute-0 sudo[74405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:37 compute-0 python3.9[74407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:36:37 compute-0 sudo[74405]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:37 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 06 13:36:38 compute-0 sudo[74560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmcaeiycxxsbpcpcklvguceymbayadhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757797.703289-78-98106163097450/AnsiballZ_blockinfile.py'
Oct 06 13:36:38 compute-0 sudo[74560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:38 compute-0 python3.9[74562]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDIuDajvNiDHQWuElohd/VDOwelWl6Rs33bl6I4GKICAse1tImqjiwyzfJFHcNkudLUcuHCfnr2ey/+Lqi0pH02+V4mL0K40cpFpzxYW93eqilVI0kICdrHrEGAk8ambXSL6IdfbeBpq1AdBhGkVFoxz31s0KiVf8eISseR3aG5XJ22Fqpt0bkyRR64eZLCHpHjYjqL7UhNyG6hWMTMgau3c2bJKgiK8dWVYpMYSHM3XD1h9RgWBfPvXPG4RcHcuHZzo/MMCb23FQ42iUrEhlp6ibmeGXNnADafmp9k8d2QUYRNCJ41cSc2GSaMVs+XDxiHD8OjPJtUt33ZpjS5ztTysrF2QfuHKEabtRT3g4pSM4b+mkSqEkTblfY63PFYNZ/j01+9niySdd3zsfBtQvnBPoCnDT/9Va/ToEZSG61Mp8aTd6B6gWi0CRcChTirLyoJ+V+MXAoKa5j5obUUxLbq3x+suYSuGsL7hgvbnnyaWuJVKJcqIcttQikRG9EbQ6M=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFFCQEXH4xm/ETbZyqDDKnWeEB0C4XGVFJ4LmKSaf5Dz
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNNFVKbUE5/nr0F2bcAkLqR7jMob3UthCY43NwN+NbGktJ1WuevLoYPHsD3uNnzEfoqWH9YFwoTfjnR1ZbcwZMs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDOAUD7+BJBQPlvKrRpdbfIpsJ8Qvom06vnov4wOhDSH6+tlSY8U0Hj4kXSjo+zNln4wuHxqT1e18GQ+M3+n6nJ8/r6mfFZCZW7YAtAc223ELyYVXK+fgW9P2GCA25PoUGoWjjNHjkhEBO7jsijCQRUMSARdFYGYgyyY88P9SEBivoA+fkdoQbofaREy1B0jTcgAfCbib86lDsL0Tr4dorjEJy09VCq2VuV1wsFu1iwmhZ9j28rzUU7nNTBG1pIAAG84sChxfmCeayDIqyD6kDemjz+erAGNmgmGuxAEsh5BEA7l781r9Sd16bScWIIWvK9/0XT6YR823tMn8mgKe4Fo4waQo7A1UwUlrQ2atrCgAgcJTwOHiiUo1jPSoamKe4kiOMsfejK8O/79tDd0fJZJ0fVRaj/bhPCVfDpyLvQ8VEaEQBElcSpB5TnamPAczoEawKSkvbURHlLL1bxFvlcDvvftAyhO5Ka2n0QstaLTuGcVHrSNEtLi+9c0B1difs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHuU9prufj23rEqz5+oVh55BFhudtdQEwZ6XaBBcCRwZ
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDb18hPNiZwX5WPJNljhASozexD2wMvsJ2LCvvqyxOObSmS6v3ueMuSFKFD8xfHxkjLxTEnPjrYzg4W5bOEF7iI=
                                             create=True mode=0644 path=/tmp/ansible.trjb2aye state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:38 compute-0 sudo[74560]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:39 compute-0 sudo[74712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcdwmhckrshfkyrndvhiywfnoecvatkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757798.6445322-94-188117983827050/AnsiballZ_command.py'
Oct 06 13:36:39 compute-0 sudo[74712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:39 compute-0 python3.9[74714]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.trjb2aye' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:39 compute-0 sudo[74712]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:40 compute-0 sudo[74866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rurperrjqfstftrhefeugfyahhhpqjhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757799.5948393-110-21408725088033/AnsiballZ_file.py'
Oct 06 13:36:40 compute-0 sudo[74866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:40 compute-0 python3.9[74868]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.trjb2aye state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:40 compute-0 sudo[74866]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:40 compute-0 sshd-session[73951]: Connection closed by 192.168.122.30 port 60922
Oct 06 13:36:40 compute-0 sshd-session[73948]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:36:40 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Oct 06 13:36:40 compute-0 systemd[1]: session-17.scope: Consumed 4.348s CPU time.
Oct 06 13:36:40 compute-0 systemd-logind[789]: Session 17 logged out. Waiting for processes to exit.
Oct 06 13:36:40 compute-0 systemd-logind[789]: Removed session 17.
Oct 06 13:36:46 compute-0 sshd-session[74893]: Accepted publickey for zuul from 192.168.122.30 port 50858 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:36:46 compute-0 systemd-logind[789]: New session 18 of user zuul.
Oct 06 13:36:46 compute-0 systemd[1]: Started Session 18 of User zuul.
Oct 06 13:36:46 compute-0 sshd-session[74893]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:36:47 compute-0 python3.9[75046]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:36:48 compute-0 sudo[75200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amcdnozyyovesezhfzkeibyqwtiaghjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757807.7492971-44-121493719929718/AnsiballZ_systemd.py'
Oct 06 13:36:48 compute-0 sudo[75200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:48 compute-0 python3.9[75202]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 06 13:36:48 compute-0 sudo[75200]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:49 compute-0 sudo[75354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daynkbphtgkjqwnoqfamvrvoytxzhupb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757808.9677784-60-131984108575607/AnsiballZ_systemd.py'
Oct 06 13:36:49 compute-0 sudo[75354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:49 compute-0 python3.9[75356]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:36:49 compute-0 sudo[75354]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:50 compute-0 sudo[75507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfpopeovzuemypustnmbsvwqjkrxyheu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757810.0038989-78-41763187874913/AnsiballZ_command.py'
Oct 06 13:36:50 compute-0 sudo[75507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:50 compute-0 python3.9[75509]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:50 compute-0 sudo[75507]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:51 compute-0 sudo[75660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvrqrvmkvkuycqirqhsylmrlyuujfwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757810.969239-94-64879015927673/AnsiballZ_stat.py'
Oct 06 13:36:51 compute-0 sudo[75660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:51 compute-0 python3.9[75662]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:36:51 compute-0 sudo[75660]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:52 compute-0 sudo[75814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekfjyzmqnacjkuydrreyhjgpjbpkedxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757811.8421955-110-256308867149292/AnsiballZ_command.py'
Oct 06 13:36:52 compute-0 sudo[75814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:52 compute-0 python3.9[75816]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:36:52 compute-0 sudo[75814]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:53 compute-0 sudo[75969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyffzvnufpvtpvqqridrtszgdhztbzyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757812.6406553-126-250904966919359/AnsiballZ_file.py'
Oct 06 13:36:53 compute-0 sudo[75969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:36:53 compute-0 python3.9[75971]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:36:53 compute-0 sudo[75969]: pam_unix(sudo:session): session closed for user root
Oct 06 13:36:53 compute-0 sshd-session[74896]: Connection closed by 192.168.122.30 port 50858
Oct 06 13:36:53 compute-0 sshd-session[74893]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:36:53 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Oct 06 13:36:53 compute-0 systemd[1]: session-18.scope: Consumed 5.424s CPU time.
Oct 06 13:36:53 compute-0 systemd-logind[789]: Session 18 logged out. Waiting for processes to exit.
Oct 06 13:36:53 compute-0 systemd-logind[789]: Removed session 18.
Oct 06 13:36:59 compute-0 sshd-session[75996]: Accepted publickey for zuul from 192.168.122.30 port 34826 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:36:59 compute-0 systemd-logind[789]: New session 19 of user zuul.
Oct 06 13:36:59 compute-0 systemd[1]: Started Session 19 of User zuul.
Oct 06 13:36:59 compute-0 sshd-session[75996]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:37:00 compute-0 python3.9[76149]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:37:01 compute-0 sudo[76303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csqlikvtbfdiuyacfzyqytfpxmxmhblj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757820.745052-48-159910981437984/AnsiballZ_setup.py'
Oct 06 13:37:01 compute-0 sudo[76303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:01 compute-0 python3.9[76305]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:37:01 compute-0 sudo[76303]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:01 compute-0 sudo[76387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soexwttepuurorqmjsidtsvrjbylocmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757820.745052-48-159910981437984/AnsiballZ_dnf.py'
Oct 06 13:37:01 compute-0 sudo[76387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:02 compute-0 python3.9[76389]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 06 13:37:03 compute-0 sudo[76387]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:04 compute-0 python3.9[76540]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:37:05 compute-0 python3.9[76691]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:37:06 compute-0 python3.9[76841]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:37:07 compute-0 python3.9[76991]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:37:07 compute-0 sshd-session[75999]: Connection closed by 192.168.122.30 port 34826
Oct 06 13:37:07 compute-0 sshd-session[75996]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:37:07 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Oct 06 13:37:07 compute-0 systemd[1]: session-19.scope: Consumed 6.589s CPU time.
Oct 06 13:37:07 compute-0 systemd-logind[789]: Session 19 logged out. Waiting for processes to exit.
Oct 06 13:37:07 compute-0 systemd-logind[789]: Removed session 19.
Oct 06 13:37:17 compute-0 sshd-session[77016]: Accepted publickey for zuul from 192.168.122.30 port 44792 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:37:17 compute-0 systemd-logind[789]: New session 20 of user zuul.
Oct 06 13:37:17 compute-0 systemd[1]: Started Session 20 of User zuul.
Oct 06 13:37:17 compute-0 sshd-session[77016]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:37:18 compute-0 python3.9[77169]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:37:20 compute-0 sudo[77323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blrlzrzzgaovoelcjtzjbldklzbgpujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757839.8539808-80-194312939066712/AnsiballZ_file.py'
Oct 06 13:37:20 compute-0 sudo[77323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:20 compute-0 python3.9[77325]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:20 compute-0 sudo[77323]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:21 compute-0 sudo[77475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhwokrhjnysiikhuteqicobshqjiwplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757840.7728338-80-95720618993514/AnsiballZ_file.py'
Oct 06 13:37:21 compute-0 sudo[77475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:21 compute-0 python3.9[77477]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:21 compute-0 sudo[77475]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:21 compute-0 sudo[77627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbyfptamyqccexdfnkljghpoddcljncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757841.454326-113-53895979690517/AnsiballZ_stat.py'
Oct 06 13:37:21 compute-0 sudo[77627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:22 compute-0 python3.9[77629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:22 compute-0 sudo[77627]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:22 compute-0 sudo[77750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emecollqosqwtwengnxcymywnfqyhwre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757841.454326-113-53895979690517/AnsiballZ_copy.py'
Oct 06 13:37:22 compute-0 sudo[77750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:22 compute-0 python3.9[77752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757841.454326-113-53895979690517/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=a6b2f7f443760376f8f7538b4e39aac79b5c60d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:22 compute-0 sudo[77750]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:23 compute-0 sudo[77902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsgbvrpcecicuprrdicxkaxfkgmmdyjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757842.726627-113-131745620106683/AnsiballZ_stat.py'
Oct 06 13:37:23 compute-0 sudo[77902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:23 compute-0 python3.9[77904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:23 compute-0 sudo[77902]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:23 compute-0 sudo[78025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olcvonchzltkacoknmrcgthvvzcpnquj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757842.726627-113-131745620106683/AnsiballZ_copy.py'
Oct 06 13:37:23 compute-0 sudo[78025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:23 compute-0 python3.9[78027]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757842.726627-113-131745620106683/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=b733546c769dfc90d01f55e0c55452d8178a3aef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:23 compute-0 sudo[78025]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:24 compute-0 sudo[78177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hidwmgcmxbmksibobpkvbzctbffhtpuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757843.8744502-113-271824690477471/AnsiballZ_stat.py'
Oct 06 13:37:24 compute-0 sudo[78177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:24 compute-0 python3.9[78179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:24 compute-0 sudo[78177]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:24 compute-0 sudo[78300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coaesjxenakeqndsvxyvcepzwmytorrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757843.8744502-113-271824690477471/AnsiballZ_copy.py'
Oct 06 13:37:24 compute-0 sudo[78300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:25 compute-0 python3.9[78302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757843.8744502-113-271824690477471/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ddec51577ab5ff7554a81b312fe6993a7b2df1da backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:25 compute-0 sudo[78300]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:25 compute-0 sudo[78452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obzazpyenfthwirakbmelzbkwwcbaqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757845.3032358-206-193526463154553/AnsiballZ_file.py'
Oct 06 13:37:25 compute-0 sudo[78452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:25 compute-0 python3.9[78454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:25 compute-0 sudo[78452]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:26 compute-0 sudo[78604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpazccszsefyalddxjawxuogcisxbygx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757845.9083612-206-183184356068446/AnsiballZ_file.py'
Oct 06 13:37:26 compute-0 sudo[78604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:26 compute-0 python3.9[78606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:26 compute-0 sudo[78604]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:27 compute-0 sudo[78756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-updktmflwngjmjgneuqbzkukuifctxyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757846.683013-242-87726881242358/AnsiballZ_stat.py'
Oct 06 13:37:27 compute-0 sudo[78756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:27 compute-0 python3.9[78758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:27 compute-0 sudo[78756]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:27 compute-0 sudo[78879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhiufwzqunwwbohrmqtelxhwnxddonms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757846.683013-242-87726881242358/AnsiballZ_copy.py'
Oct 06 13:37:27 compute-0 sudo[78879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:27 compute-0 python3.9[78881]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757846.683013-242-87726881242358/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=fbad98da0f1e5065e09642fae97715ed153d5c82 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:27 compute-0 sudo[78879]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:28 compute-0 sudo[79031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vizddyqeoxzejabsgywvpeumcpkpprpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757848.0380821-242-154334854147778/AnsiballZ_stat.py'
Oct 06 13:37:28 compute-0 sudo[79031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:28 compute-0 python3.9[79033]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:28 compute-0 sudo[79031]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:28 compute-0 sudo[79154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyfhmuxxqzgkprkbqtjcikcabcfysvvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757848.0380821-242-154334854147778/AnsiballZ_copy.py'
Oct 06 13:37:28 compute-0 sudo[79154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:29 compute-0 python3.9[79156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757848.0380821-242-154334854147778/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e54e39a679fe96e7c8640d92987ca34bf8d5b1fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:29 compute-0 sudo[79154]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:29 compute-0 sudo[79306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyycsodrpywuuzykxywocnsmzksimagu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757849.309793-242-247273674870421/AnsiballZ_stat.py'
Oct 06 13:37:29 compute-0 sudo[79306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:29 compute-0 python3.9[79308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:29 compute-0 sudo[79306]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:30 compute-0 sudo[79429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dabzkfsusknfsxxsfkaqbnyxlgsjefqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757849.309793-242-247273674870421/AnsiballZ_copy.py'
Oct 06 13:37:30 compute-0 sudo[79429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:30 compute-0 python3.9[79431]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757849.309793-242-247273674870421/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d27ee355460206fc21d2d6d104d0d314e3eb9646 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:30 compute-0 sudo[79429]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:31 compute-0 sudo[79581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlyjukdohkmlqrgicjqjwoxwfcyfzhyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757850.7177777-334-250155678288640/AnsiballZ_file.py'
Oct 06 13:37:31 compute-0 sudo[79581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:31 compute-0 python3.9[79583]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:31 compute-0 sudo[79581]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:31 compute-0 sudo[79733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebrrhlzfgqnkclqjerckzsfhgykdqcib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757851.444135-334-739310876116/AnsiballZ_file.py'
Oct 06 13:37:31 compute-0 sudo[79733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:31 compute-0 python3.9[79735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:32 compute-0 sudo[79733]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:32 compute-0 sudo[79885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekiiuyzsqpmrbagxkuhbialangrudpeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757852.1995564-367-4541979327421/AnsiballZ_stat.py'
Oct 06 13:37:32 compute-0 sudo[79885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:32 compute-0 python3.9[79887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:32 compute-0 sudo[79885]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:33 compute-0 sudo[80008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haomupigpfswbyagdptpaiedkvsxssbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757852.1995564-367-4541979327421/AnsiballZ_copy.py'
Oct 06 13:37:33 compute-0 sudo[80008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:33 compute-0 python3.9[80010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757852.1995564-367-4541979327421/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5ba1f8b2979b8b5e225f12ebc9f4495a8a098f5b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:33 compute-0 sudo[80008]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:33 compute-0 sudo[80160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrwiynqffoqybedngdzlqksmictpiyxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757853.4871492-367-221568843550195/AnsiballZ_stat.py'
Oct 06 13:37:33 compute-0 sudo[80160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:34 compute-0 python3.9[80162]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:34 compute-0 sudo[80160]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:34 compute-0 sudo[80283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgfindtofzrmfidocrcojlcqkwkhngmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757853.4871492-367-221568843550195/AnsiballZ_copy.py'
Oct 06 13:37:34 compute-0 sudo[80283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:34 compute-0 python3.9[80285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757853.4871492-367-221568843550195/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=131915b6aaf461b4c183be8e7b25f8847704b43b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:34 compute-0 sudo[80283]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:35 compute-0 sudo[80435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agugrypxtfcfwxfpjnvvutfpnytrzeqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757854.8779757-367-276452974447444/AnsiballZ_stat.py'
Oct 06 13:37:35 compute-0 sudo[80435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:35 compute-0 python3.9[80437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:35 compute-0 sudo[80435]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:35 compute-0 sudo[80558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnxkunbwnaraunqaxokpjufvdbqfbzpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757854.8779757-367-276452974447444/AnsiballZ_copy.py'
Oct 06 13:37:35 compute-0 sudo[80558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:36 compute-0 python3.9[80560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757854.8779757-367-276452974447444/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d716702796fbe22f17f301d26259830e4c22851d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:36 compute-0 sudo[80558]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:36 compute-0 sudo[80710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljfmlnvgerxohqcpzogvbiifurigauwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757856.2415004-461-131493371776456/AnsiballZ_file.py'
Oct 06 13:37:36 compute-0 sudo[80710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:36 compute-0 python3.9[80712]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:36 compute-0 sudo[80710]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:37 compute-0 sudo[80862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvkfbxzvberqjfssmbtksrlsvdsqdzhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757856.9467206-461-181338818805183/AnsiballZ_file.py'
Oct 06 13:37:37 compute-0 sudo[80862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:37 compute-0 python3.9[80864]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:37 compute-0 sudo[80862]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:37 compute-0 sudo[81014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrrprrlvbemizmszklyvywuayszhewpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757857.652818-492-25407778325818/AnsiballZ_stat.py'
Oct 06 13:37:37 compute-0 sudo[81014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:38 compute-0 python3.9[81016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:38 compute-0 sudo[81014]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:38 compute-0 sudo[81137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwkrczdrhiuspqflvlzjflpxoqfbwhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757857.652818-492-25407778325818/AnsiballZ_copy.py'
Oct 06 13:37:38 compute-0 sudo[81137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:38 compute-0 python3.9[81139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757857.652818-492-25407778325818/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=61b8b774514eef0c46aa76a30b031770482ea61f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:38 compute-0 sudo[81137]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:39 compute-0 sudo[81289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsfqxjlkqrclkixfchezkfkpadrlblbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757858.91768-492-225592385914546/AnsiballZ_stat.py'
Oct 06 13:37:39 compute-0 sudo[81289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:39 compute-0 python3.9[81291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:39 compute-0 sudo[81289]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:39 compute-0 sudo[81412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrjgkpbimajismbdtybjmdthvdiimta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757858.91768-492-225592385914546/AnsiballZ_copy.py'
Oct 06 13:37:39 compute-0 sudo[81412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:40 compute-0 python3.9[81414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757858.91768-492-225592385914546/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=131915b6aaf461b4c183be8e7b25f8847704b43b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:40 compute-0 sudo[81412]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:40 compute-0 sudo[81564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omhmktrrarkiwvwwpzffgxrlyaabxxgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757860.3175838-492-239222190047101/AnsiballZ_stat.py'
Oct 06 13:37:40 compute-0 sudo[81564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:40 compute-0 python3.9[81566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:40 compute-0 sudo[81564]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:41 compute-0 sudo[81687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbmbzqffdhhhuukrkdeqpttanhcsrzob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757860.3175838-492-239222190047101/AnsiballZ_copy.py'
Oct 06 13:37:41 compute-0 sudo[81687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:41 compute-0 python3.9[81689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757860.3175838-492-239222190047101/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=83ca6ae96e410f7d147740a283cc39e9f72cc8c9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:41 compute-0 sudo[81687]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:42 compute-0 sudo[81839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotjdflulbaydnlqaddjmhkosfodcodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757862.264429-619-187455117040344/AnsiballZ_file.py'
Oct 06 13:37:42 compute-0 sudo[81839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:42 compute-0 python3.9[81841]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:42 compute-0 sudo[81839]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:43 compute-0 sudo[81991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahwjbfshegestwdkknzpvzbdnhytdaan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757862.9179833-638-107183082035024/AnsiballZ_stat.py'
Oct 06 13:37:43 compute-0 sudo[81991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:43 compute-0 python3.9[81993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:43 compute-0 sudo[81991]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:43 compute-0 sudo[82114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdsialkrxhtoaxsfindijfqfyecgmhwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757862.9179833-638-107183082035024/AnsiballZ_copy.py'
Oct 06 13:37:43 compute-0 sudo[82114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:43 compute-0 python3.9[82116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757862.9179833-638-107183082035024/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:43 compute-0 sudo[82114]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:44 compute-0 sudo[82266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbowkzxfdtxsgbdrwaercehlpphjxrht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757864.0958319-671-51904341005299/AnsiballZ_file.py'
Oct 06 13:37:44 compute-0 sudo[82266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:44 compute-0 python3.9[82268]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:44 compute-0 sudo[82266]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:45 compute-0 sudo[82418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnnvnvhetwbyoqwfvyjbzqeovkflzvrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757864.8280938-686-278222580621539/AnsiballZ_stat.py'
Oct 06 13:37:45 compute-0 sudo[82418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:45 compute-0 python3.9[82420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:45 compute-0 sudo[82418]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:45 compute-0 sudo[82541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlgygdgjpzocnebdvgsuyushtcdqulqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757864.8280938-686-278222580621539/AnsiballZ_copy.py'
Oct 06 13:37:45 compute-0 sudo[82541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:45 compute-0 python3.9[82543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757864.8280938-686-278222580621539/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:45 compute-0 sudo[82541]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:46 compute-0 sudo[82693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxwwdbmgznusfasofxvvlsuiedfouame ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757866.1055074-721-56761259463151/AnsiballZ_file.py'
Oct 06 13:37:46 compute-0 sudo[82693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:46 compute-0 python3.9[82695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:46 compute-0 sudo[82693]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:47 compute-0 sudo[82845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpteipmkwwjbwndmogjqqiphszhmznda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757866.8182693-739-135413069685513/AnsiballZ_stat.py'
Oct 06 13:37:47 compute-0 sudo[82845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:47 compute-0 python3.9[82847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:47 compute-0 sudo[82845]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:47 compute-0 sudo[82968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilqhjglbhqghlayehwmxkgmjnlkvaimb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757866.8182693-739-135413069685513/AnsiballZ_copy.py'
Oct 06 13:37:47 compute-0 sudo[82968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:48 compute-0 python3.9[82970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757866.8182693-739-135413069685513/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:48 compute-0 sudo[82968]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:48 compute-0 sudo[83120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkuvkzajwomcnnvgalcdyykdazuazkpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757868.2785974-773-84447809607145/AnsiballZ_file.py'
Oct 06 13:37:48 compute-0 sudo[83120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:48 compute-0 python3.9[83122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:48 compute-0 sudo[83120]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:49 compute-0 sudo[83272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcetpitkcmfwfsdwoxeraawxbcmwbelh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757869.0575376-790-276446174192981/AnsiballZ_stat.py'
Oct 06 13:37:49 compute-0 sudo[83272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:49 compute-0 python3.9[83274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:49 compute-0 sudo[83272]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:50 compute-0 sudo[83395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-worazwcceedsochmkvtpvhyeseinmceq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757869.0575376-790-276446174192981/AnsiballZ_copy.py'
Oct 06 13:37:50 compute-0 sudo[83395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:50 compute-0 python3.9[83397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757869.0575376-790-276446174192981/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:50 compute-0 sudo[83395]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:50 compute-0 sudo[83547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshqhlgvlkwywklrqqqkrsfrgffvvjun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757870.6192813-824-116485193415238/AnsiballZ_file.py'
Oct 06 13:37:50 compute-0 sudo[83547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:51 compute-0 python3.9[83549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:51 compute-0 sudo[83547]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:51 compute-0 sudo[83699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkmpevkijhqeenqympcqsjbrhrnfkbsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757871.3815525-842-236421672803160/AnsiballZ_stat.py'
Oct 06 13:37:51 compute-0 sudo[83699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:51 compute-0 python3.9[83701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:51 compute-0 sudo[83699]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:52 compute-0 sudo[83822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdyrugpskszkzvxxppwchijnjsbgkkyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757871.3815525-842-236421672803160/AnsiballZ_copy.py'
Oct 06 13:37:52 compute-0 sudo[83822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:52 compute-0 python3.9[83824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757871.3815525-842-236421672803160/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:52 compute-0 sudo[83822]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:53 compute-0 sudo[83974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtmsvuoesidjlbiaksxpsnsiwhdgmjdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757872.7862024-871-44368671229546/AnsiballZ_file.py'
Oct 06 13:37:53 compute-0 sudo[83974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:53 compute-0 python3.9[83976]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:53 compute-0 sudo[83974]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:53 compute-0 sudo[84126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsavutsbrdgukeqntpgioxqbaqnycrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757873.4763322-879-13973917955694/AnsiballZ_stat.py'
Oct 06 13:37:53 compute-0 sudo[84126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:54 compute-0 python3.9[84128]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:54 compute-0 sudo[84126]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:54 compute-0 sudo[84249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spyawwhfmszcpwixlkzfrknpzujrymsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757873.4763322-879-13973917955694/AnsiballZ_copy.py'
Oct 06 13:37:54 compute-0 sudo[84249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:54 compute-0 python3.9[84251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757873.4763322-879-13973917955694/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:54 compute-0 sudo[84249]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:55 compute-0 sudo[84401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekvnrajcqphjmldqkvymspehhskdosg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757874.9329457-895-237402523147276/AnsiballZ_file.py'
Oct 06 13:37:55 compute-0 sudo[84401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:55 compute-0 python3.9[84403]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:37:55 compute-0 sudo[84401]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:56 compute-0 sudo[84553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wioteesfdveuphugemfynnpujinhajiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757875.6777573-903-1798115906143/AnsiballZ_stat.py'
Oct 06 13:37:56 compute-0 sudo[84553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:56 compute-0 python3.9[84555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:37:56 compute-0 sudo[84553]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:56 compute-0 sudo[84676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqhtrqkxbdeurnvrbdcsvsvuuccxlfel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757875.6777573-903-1798115906143/AnsiballZ_copy.py'
Oct 06 13:37:56 compute-0 sudo[84676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:37:56 compute-0 python3.9[84678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757875.6777573-903-1798115906143/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2d30bf5e4294e3b1ccba3d399c329ed6db5e66b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:37:56 compute-0 sudo[84676]: pam_unix(sudo:session): session closed for user root
Oct 06 13:37:57 compute-0 sshd-session[77019]: Connection closed by 192.168.122.30 port 44792
Oct 06 13:37:57 compute-0 sshd-session[77016]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:37:57 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Oct 06 13:37:57 compute-0 systemd[1]: session-20.scope: Consumed 33.357s CPU time.
Oct 06 13:37:57 compute-0 systemd-logind[789]: Session 20 logged out. Waiting for processes to exit.
Oct 06 13:37:57 compute-0 systemd-logind[789]: Removed session 20.
Oct 06 13:37:59 compute-0 PackageKit[31588]: daemon quit
Oct 06 13:37:59 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 06 13:38:03 compute-0 sshd-session[84706]: Accepted publickey for zuul from 192.168.122.30 port 42056 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:38:03 compute-0 systemd-logind[789]: New session 21 of user zuul.
Oct 06 13:38:03 compute-0 systemd[1]: Started Session 21 of User zuul.
Oct 06 13:38:03 compute-0 sshd-session[84706]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:38:04 compute-0 python3.9[84859]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:38:05 compute-0 sudo[85013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofliwzujfyyhweejtosskudhhzfepnyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757884.6598818-48-227280901781152/AnsiballZ_file.py'
Oct 06 13:38:05 compute-0 sudo[85013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:05 compute-0 python3.9[85015]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:05 compute-0 sudo[85013]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:05 compute-0 sudo[85165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxfvqnygkhtobbzmbtwtfiawbwlvjqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757885.573584-48-142352147321335/AnsiballZ_file.py'
Oct 06 13:38:05 compute-0 sudo[85165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:06 compute-0 python3.9[85167]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:06 compute-0 sudo[85165]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:06 compute-0 python3.9[85317]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:38:07 compute-0 sudo[85467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwjgyyuewxmmqpljmfvtanirwwjgsehp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757887.1621642-94-87109890664209/AnsiballZ_seboolean.py'
Oct 06 13:38:07 compute-0 sudo[85467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:07 compute-0 python3.9[85469]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 06 13:38:09 compute-0 sudo[85467]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:09 compute-0 sudo[85623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgjhzacauedwfnosybjypapgvubvkwed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757889.4233067-114-72945008324709/AnsiballZ_setup.py'
Oct 06 13:38:09 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 06 13:38:09 compute-0 sudo[85623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:10 compute-0 python3.9[85625]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:38:10 compute-0 sudo[85623]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:10 compute-0 sudo[85707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuvatfdsyhphxqygjlkirjkaydkldbnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757889.4233067-114-72945008324709/AnsiballZ_dnf.py'
Oct 06 13:38:10 compute-0 sudo[85707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:11 compute-0 python3.9[85709]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:38:12 compute-0 sudo[85707]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:13 compute-0 sudo[85860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddgdszggbhrmrmlgdokuueyqrbpyloga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757892.40964-138-268464731421756/AnsiballZ_systemd.py'
Oct 06 13:38:13 compute-0 sudo[85860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:13 compute-0 python3.9[85862]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:38:13 compute-0 sudo[85860]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:14 compute-0 sudo[86015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxtmwdvyqffwxchdskenycrcejqgbov ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759757893.6745174-154-265778185888976/AnsiballZ_edpm_nftables_snippet.py'
Oct 06 13:38:14 compute-0 sudo[86015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:14 compute-0 python3[86017]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 06 13:38:14 compute-0 sudo[86015]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:15 compute-0 sudo[86167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwvotghwjikgibjgugmmdmtuvpqrntzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757894.685227-172-74276093762975/AnsiballZ_file.py'
Oct 06 13:38:15 compute-0 sudo[86167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:15 compute-0 python3.9[86169]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:15 compute-0 sudo[86167]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:15 compute-0 sudo[86319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncanwpyzrupnaomxdnxazbemiohogkyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757895.4593346-188-73076756051345/AnsiballZ_stat.py'
Oct 06 13:38:15 compute-0 sudo[86319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:16 compute-0 python3.9[86321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:16 compute-0 sudo[86319]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:16 compute-0 sudo[86397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwgkueviqwcptzmdmcreehlbltsysqfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757895.4593346-188-73076756051345/AnsiballZ_file.py'
Oct 06 13:38:16 compute-0 sudo[86397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:16 compute-0 python3.9[86399]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:16 compute-0 sudo[86397]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:17 compute-0 sudo[86549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mikcvriegqtkwsfsvvwfjyfyvegukrcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757896.7583044-212-25623215736068/AnsiballZ_stat.py'
Oct 06 13:38:17 compute-0 sudo[86549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:17 compute-0 python3.9[86551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:17 compute-0 sudo[86549]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:17 compute-0 sudo[86627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuzjkmndbitbfojfczckdvfdzimcgxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757896.7583044-212-25623215736068/AnsiballZ_file.py'
Oct 06 13:38:17 compute-0 sudo[86627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:17 compute-0 python3.9[86629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x279aejt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:17 compute-0 sudo[86627]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:18 compute-0 sudo[86779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkgowpwvrdhaigzqafxhtyxsyocsqtrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757897.9615376-236-276950840225439/AnsiballZ_stat.py'
Oct 06 13:38:18 compute-0 sudo[86779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:18 compute-0 python3.9[86781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:18 compute-0 sudo[86779]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:18 compute-0 sudo[86857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quwvacxdfavfbodudstvskhgswauxhin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757897.9615376-236-276950840225439/AnsiballZ_file.py'
Oct 06 13:38:18 compute-0 sudo[86857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:19 compute-0 python3.9[86859]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:19 compute-0 sudo[86857]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:20 compute-0 sudo[87009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnpcmdbsgaxvfxvjzbctdwemvhzduyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757899.496736-262-97121516395213/AnsiballZ_command.py'
Oct 06 13:38:20 compute-0 sudo[87009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:20 compute-0 python3.9[87011]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:20 compute-0 sudo[87009]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:20 compute-0 sudo[87162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiuvsoqrnommaheqorahkceuttskcxuv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759757900.5047603-278-188065057702092/AnsiballZ_edpm_nftables_from_files.py'
Oct 06 13:38:20 compute-0 sudo[87162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:21 compute-0 python3[87164]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 06 13:38:21 compute-0 sudo[87162]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:21 compute-0 sudo[87314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmydfsstqrtenjcstbsgekknmbrfdyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757901.3501127-294-114001426698478/AnsiballZ_stat.py'
Oct 06 13:38:21 compute-0 sudo[87314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:21 compute-0 python3.9[87316]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:21 compute-0 sudo[87314]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:22 compute-0 sudo[87439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wieblftgbhejdxjwcadnbhcvaiuempci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757901.3501127-294-114001426698478/AnsiballZ_copy.py'
Oct 06 13:38:22 compute-0 sudo[87439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:22 compute-0 python3.9[87441]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757901.3501127-294-114001426698478/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:22 compute-0 sudo[87439]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:23 compute-0 sudo[87591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-curykesoabqtbipdpzjvxzcgzpirxjzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757903.1346965-324-79331519036525/AnsiballZ_stat.py'
Oct 06 13:38:23 compute-0 sudo[87591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:23 compute-0 python3.9[87593]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:23 compute-0 sudo[87591]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:24 compute-0 sudo[87716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiilhmtnmdeiuabvniznnytlqminjitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757903.1346965-324-79331519036525/AnsiballZ_copy.py'
Oct 06 13:38:24 compute-0 sudo[87716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:24 compute-0 python3.9[87718]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757903.1346965-324-79331519036525/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:24 compute-0 sudo[87716]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:25 compute-0 sudo[87868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heosayvqnysbmcrihxboictldmikecma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757904.6368668-354-230581232358980/AnsiballZ_stat.py'
Oct 06 13:38:25 compute-0 sudo[87868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:25 compute-0 python3.9[87870]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:25 compute-0 sudo[87868]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:25 compute-0 sudo[87993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjenqlfaaqyqexqwohypxmpagjzfjnre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757904.6368668-354-230581232358980/AnsiballZ_copy.py'
Oct 06 13:38:25 compute-0 sudo[87993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:25 compute-0 python3.9[87995]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757904.6368668-354-230581232358980/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:25 compute-0 sudo[87993]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:26 compute-0 sudo[88145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvlvripnzmwxtyttgwqhkovsxohlika ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757906.0572436-384-5289457088254/AnsiballZ_stat.py'
Oct 06 13:38:26 compute-0 sudo[88145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:26 compute-0 python3.9[88147]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:26 compute-0 sudo[88145]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:27 compute-0 sudo[88270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spqxlumgyoweqicwrusvfmttmowmmklu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757906.0572436-384-5289457088254/AnsiballZ_copy.py'
Oct 06 13:38:27 compute-0 sudo[88270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:27 compute-0 python3.9[88272]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757906.0572436-384-5289457088254/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:27 compute-0 sudo[88270]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:27 compute-0 sudo[88422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmnflsgkbnttyjoclkmiojxdxtlqjopr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757907.5473733-414-142134976032267/AnsiballZ_stat.py'
Oct 06 13:38:28 compute-0 sudo[88422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:28 compute-0 python3.9[88424]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:28 compute-0 sudo[88422]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:28 compute-0 sudo[88547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aupdkavuotxemdhyapopnslvwlcsljkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757907.5473733-414-142134976032267/AnsiballZ_copy.py'
Oct 06 13:38:28 compute-0 sudo[88547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:28 compute-0 python3.9[88549]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759757907.5473733-414-142134976032267/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:28 compute-0 sudo[88547]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:29 compute-0 sudo[88699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delwnlmczlwpzxmqxjtyukofxeitfdbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757909.1602452-444-103964165540499/AnsiballZ_file.py'
Oct 06 13:38:29 compute-0 sudo[88699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:29 compute-0 python3.9[88701]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:29 compute-0 sudo[88699]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:30 compute-0 sudo[88851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpahpsyoocjzrdwgihqetiadhgdrupur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757909.9500515-460-280275435470158/AnsiballZ_command.py'
Oct 06 13:38:30 compute-0 sudo[88851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:30 compute-0 python3.9[88853]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:30 compute-0 sudo[88851]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:31 compute-0 sudo[89006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byiajmzwpxcqpkbttcofwqddmytksrnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757910.765402-476-41601994233477/AnsiballZ_blockinfile.py'
Oct 06 13:38:31 compute-0 sudo[89006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:31 compute-0 python3.9[89008]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:31 compute-0 sudo[89006]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:32 compute-0 sudo[89158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlthwvsgzvyeccfwzodcxufenloaeooq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757911.7891464-494-198372146884956/AnsiballZ_command.py'
Oct 06 13:38:32 compute-0 sudo[89158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:32 compute-0 python3.9[89160]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:32 compute-0 sudo[89158]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:32 compute-0 sudo[89311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avxofyexmdawdvacmouezqqhsjsikoha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757912.5973716-510-169744980494635/AnsiballZ_stat.py'
Oct 06 13:38:32 compute-0 sudo[89311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:33 compute-0 python3.9[89313]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:38:33 compute-0 sudo[89311]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:33 compute-0 sudo[89465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-realahoaevitnqbjkmjyqxwydtimsvxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757913.3426309-526-36878073318549/AnsiballZ_command.py'
Oct 06 13:38:33 compute-0 sudo[89465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:33 compute-0 python3.9[89467]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:33 compute-0 sudo[89465]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:34 compute-0 sudo[89620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsuanaundfwepsmxdleqvnoswesgvttt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757914.1622899-542-108723780421674/AnsiballZ_file.py'
Oct 06 13:38:34 compute-0 sudo[89620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:34 compute-0 python3.9[89622]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:34 compute-0 sudo[89620]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:35 compute-0 python3.9[89772]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:38:36 compute-0 sudo[89923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbuzytyhauskbvlsduusylatcvjwypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757916.6080034-622-47895944397757/AnsiballZ_command.py'
Oct 06 13:38:36 compute-0 sudo[89923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:37 compute-0 python3.9[89925]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:37 compute-0 ovs-vsctl[89926]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 06 13:38:37 compute-0 sudo[89923]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:37 compute-0 sudo[90076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpcvclroppcxgnahmqhbqiticjoswqvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757917.4075532-640-231553091803984/AnsiballZ_command.py'
Oct 06 13:38:37 compute-0 sudo[90076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:37 compute-0 python3.9[90078]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:37 compute-0 sudo[90076]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:38 compute-0 sudo[90231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwnujzmtdsvdqhyfhjxjgohwoozsknmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757918.1792388-656-138323150253333/AnsiballZ_command.py'
Oct 06 13:38:38 compute-0 sudo[90231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:38 compute-0 python3.9[90233]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:38:38 compute-0 ovs-vsctl[90234]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 06 13:38:38 compute-0 sudo[90231]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:39 compute-0 python3.9[90384]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:38:40 compute-0 sudo[90536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fisggfpqjlxodhxfnkwswehomasvsfgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757920.0109878-690-27140447385996/AnsiballZ_file.py'
Oct 06 13:38:40 compute-0 sudo[90536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:40 compute-0 python3.9[90538]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:40 compute-0 sudo[90536]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:41 compute-0 sudo[90688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fagdhlfsimehxbrbjanomkznygnvkwsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757920.7824073-706-159395051225113/AnsiballZ_stat.py'
Oct 06 13:38:41 compute-0 sudo[90688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:41 compute-0 python3.9[90690]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:41 compute-0 sudo[90688]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:41 compute-0 sudo[90766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuttfbuinunpssdtzsakclteyfvybvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757920.7824073-706-159395051225113/AnsiballZ_file.py'
Oct 06 13:38:41 compute-0 sudo[90766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:41 compute-0 python3.9[90768]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:41 compute-0 sudo[90766]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:42 compute-0 sudo[90918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdllbyyzfilgsbqwikgritwzrsbotmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757922.149773-706-10848333732803/AnsiballZ_stat.py'
Oct 06 13:38:42 compute-0 sudo[90918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:42 compute-0 python3.9[90920]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:42 compute-0 sudo[90918]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:42 compute-0 sudo[90996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvolrgrgmglxwnynkcwgbsvsyntbheuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757922.149773-706-10848333732803/AnsiballZ_file.py'
Oct 06 13:38:42 compute-0 sudo[90996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:43 compute-0 python3.9[90998]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:43 compute-0 sudo[90996]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:43 compute-0 sudo[91148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhitaqjhrgwnitdlkhutpreaninulvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757923.41998-752-250553092700265/AnsiballZ_file.py'
Oct 06 13:38:43 compute-0 sudo[91148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:43 compute-0 python3.9[91150]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:44 compute-0 sudo[91148]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:44 compute-0 sudo[91300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygupzehvnilooptmxvbhitmhmzzmkpag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757924.1673748-768-23780503010823/AnsiballZ_stat.py'
Oct 06 13:38:44 compute-0 sudo[91300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:44 compute-0 python3.9[91302]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:44 compute-0 sudo[91300]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:44 compute-0 sudo[91378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjafprjhmlkvdyreyvnhdoaryenfhmft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757924.1673748-768-23780503010823/AnsiballZ_file.py'
Oct 06 13:38:44 compute-0 sudo[91378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:45 compute-0 python3.9[91380]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:45 compute-0 sudo[91378]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:45 compute-0 sudo[91530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldysyeawvyvnrzkrzhawlrnamnblylgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757925.400947-792-67114005469721/AnsiballZ_stat.py'
Oct 06 13:38:45 compute-0 sudo[91530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:45 compute-0 python3.9[91532]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:45 compute-0 sudo[91530]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:46 compute-0 sudo[91608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-natziibkdkhtupxkyyjykoqrsmgdvtsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757925.400947-792-67114005469721/AnsiballZ_file.py'
Oct 06 13:38:46 compute-0 sudo[91608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:46 compute-0 python3.9[91610]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:46 compute-0 sudo[91608]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:46 compute-0 sudo[91760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxicxfkgsusrxvpxysrjxqhtapuqkjjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757926.6562648-816-143099865637611/AnsiballZ_systemd.py'
Oct 06 13:38:46 compute-0 sudo[91760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:47 compute-0 python3.9[91762]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:38:47 compute-0 systemd[1]: Reloading.
Oct 06 13:38:47 compute-0 systemd-rc-local-generator[91784]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:38:47 compute-0 systemd-sysv-generator[91791]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:38:47 compute-0 sudo[91760]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:48 compute-0 sudo[91950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ludqsfwgznlqsgimwirvkwdhpdrjpqyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757927.783126-832-115619248177914/AnsiballZ_stat.py'
Oct 06 13:38:48 compute-0 sudo[91950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:48 compute-0 python3.9[91952]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:48 compute-0 sudo[91950]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:48 compute-0 sudo[92028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmffcfnkcyfjxxohtyqvtcjsfoaloghe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757927.783126-832-115619248177914/AnsiballZ_file.py'
Oct 06 13:38:48 compute-0 sudo[92028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:48 compute-0 python3.9[92030]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:48 compute-0 sudo[92028]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:49 compute-0 sudo[92180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rncfyehtotbgmsraxcmrdxbmmeayxmtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757928.9854805-856-45644866799755/AnsiballZ_stat.py'
Oct 06 13:38:49 compute-0 sudo[92180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:49 compute-0 python3.9[92182]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:49 compute-0 sudo[92180]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:49 compute-0 sudo[92258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-appshknnuqpgvxjovhfdnbsijsrzsetg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757928.9854805-856-45644866799755/AnsiballZ_file.py'
Oct 06 13:38:49 compute-0 sudo[92258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:50 compute-0 python3.9[92260]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:50 compute-0 sudo[92258]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:50 compute-0 sudo[92410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqtifsbpvmjymkjfmsekibbhvrxfbhpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757930.3600426-880-124918027987643/AnsiballZ_systemd.py'
Oct 06 13:38:50 compute-0 sudo[92410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:51 compute-0 python3.9[92412]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:38:51 compute-0 systemd[1]: Reloading.
Oct 06 13:38:51 compute-0 systemd-rc-local-generator[92438]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:38:51 compute-0 systemd-sysv-generator[92443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:38:51 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:38:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:38:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:38:51 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:38:51 compute-0 sudo[92410]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:52 compute-0 sudo[92602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozlvmzqtqueitiymomlhnqfbqiarslyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757931.7231486-900-123067791204481/AnsiballZ_file.py'
Oct 06 13:38:52 compute-0 sudo[92602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:52 compute-0 python3.9[92604]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:52 compute-0 sudo[92602]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:52 compute-0 sudo[92754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjloaivphogrezwjsrkdjyospvgkttjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757932.6251752-916-14981040387373/AnsiballZ_stat.py'
Oct 06 13:38:52 compute-0 sudo[92754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:53 compute-0 python3.9[92756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:53 compute-0 sudo[92754]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:53 compute-0 sudo[92877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijnfiwabyxaqeevuqhbthktqiwgrmvfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757932.6251752-916-14981040387373/AnsiballZ_copy.py'
Oct 06 13:38:53 compute-0 sudo[92877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:53 compute-0 python3.9[92879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757932.6251752-916-14981040387373/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:53 compute-0 sudo[92877]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:54 compute-0 sudo[93029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-telgzmlyecstkyuovtvdohbipmortiya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757934.1911705-950-15621668771131/AnsiballZ_file.py'
Oct 06 13:38:54 compute-0 sudo[93029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:54 compute-0 python3.9[93031]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:38:54 compute-0 sudo[93029]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:55 compute-0 sudo[93181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsawtmwxbwlygwiekchenskhwxeqqdsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757934.9832232-966-121510115968745/AnsiballZ_stat.py'
Oct 06 13:38:55 compute-0 sudo[93181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:55 compute-0 python3.9[93183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:38:55 compute-0 sudo[93181]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:56 compute-0 sudo[93304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksfxpkkwlznfecbgrzztteiwovxmjks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757934.9832232-966-121510115968745/AnsiballZ_copy.py'
Oct 06 13:38:56 compute-0 sudo[93304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:56 compute-0 python3.9[93306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757934.9832232-966-121510115968745/.source.json _original_basename=.b6dh5axj follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:56 compute-0 sudo[93304]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:57 compute-0 sudo[93456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuvlggsxnwvbzbvlniyghgzrehflmizm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757936.5988867-996-240703590751048/AnsiballZ_file.py'
Oct 06 13:38:57 compute-0 sudo[93456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:57 compute-0 python3.9[93458]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:38:57 compute-0 sudo[93456]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:57 compute-0 sudo[93608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yclgevppftmygxauslyprlhhoeobgywm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757937.5496752-1012-257248346345067/AnsiballZ_stat.py'
Oct 06 13:38:57 compute-0 sudo[93608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:58 compute-0 sudo[93608]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:58 compute-0 sudo[93731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqbyiyzoqxfxcwowqxnjbkuzyrrhazux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757937.5496752-1012-257248346345067/AnsiballZ_copy.py'
Oct 06 13:38:58 compute-0 sudo[93731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:58 compute-0 sudo[93731]: pam_unix(sudo:session): session closed for user root
Oct 06 13:38:59 compute-0 sudo[93883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khrhtzrnzrcjlutvcfpxeoykfpfchxcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757939.1062086-1046-179092316077212/AnsiballZ_container_config_data.py'
Oct 06 13:38:59 compute-0 sudo[93883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:38:59 compute-0 python3.9[93885]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 06 13:38:59 compute-0 sudo[93883]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:00 compute-0 sudo[94035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ranyagaycsqwzwdknoesejuqioxoipbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757940.0798914-1064-69772388802970/AnsiballZ_container_config_hash.py'
Oct 06 13:39:00 compute-0 sudo[94035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:00 compute-0 python3.9[94037]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:39:00 compute-0 sudo[94035]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:01 compute-0 sudo[94187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpvtqiqkohxyyuhrjvztmifqxdevvmjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757941.2100346-1082-269691696272877/AnsiballZ_podman_container_info.py'
Oct 06 13:39:01 compute-0 sudo[94187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:01 compute-0 python3.9[94189]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 06 13:39:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:39:01 compute-0 sudo[94187]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:02 compute-0 sudo[94351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbghncvdentlmpngoslrldtdkbbieguj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759757942.4150686-1108-170250432116855/AnsiballZ_edpm_container_manage.py'
Oct 06 13:39:02 compute-0 sudo[94351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:03 compute-0 python3[94353]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:39:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:39:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:39:03 compute-0 podman[94390]: 2025-10-06 13:39:03.427174365 +0000 UTC m=+0.059389758 container create 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 06 13:39:03 compute-0 podman[94390]: 2025-10-06 13:39:03.397761332 +0000 UTC m=+0.029976755 image pull 2c4150b67f2803f56f4e9488a6a1d434787a7813c9b1fcb4aed975e77b886b52 38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 06 13:39:03 compute-0 python3[94353]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 06 13:39:03 compute-0 sudo[94351]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:04 compute-0 sudo[94578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djvveaqioxkhshdoooddnjaomgyqzklf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757943.8171911-1124-57928025194894/AnsiballZ_stat.py'
Oct 06 13:39:04 compute-0 sudo[94578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 06 13:39:04 compute-0 python3.9[94580]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:39:04 compute-0 sudo[94578]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:04 compute-0 sudo[94732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvtxfmfebttbdrjcoqaszanljvcnggkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757944.6691427-1142-48164969792326/AnsiballZ_file.py'
Oct 06 13:39:04 compute-0 sudo[94732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:05 compute-0 python3.9[94734]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:05 compute-0 sudo[94732]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:05 compute-0 sudo[94808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqrmrpvlgvsqddpbvennlekqsokklvqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757944.6691427-1142-48164969792326/AnsiballZ_stat.py'
Oct 06 13:39:05 compute-0 sudo[94808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:05 compute-0 python3.9[94810]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:39:05 compute-0 sudo[94808]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:06 compute-0 sudo[94959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkirajvlisyobakvwxdxwbrnkxnwycin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757945.9534962-1142-134899504527433/AnsiballZ_copy.py'
Oct 06 13:39:06 compute-0 sudo[94959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:06 compute-0 python3.9[94961]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759757945.9534962-1142-134899504527433/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:06 compute-0 sudo[94959]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:06 compute-0 sudo[95035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwknmlytmqxdvbhziaryqngrkuitwik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757945.9534962-1142-134899504527433/AnsiballZ_systemd.py'
Oct 06 13:39:06 compute-0 sudo[95035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:07 compute-0 python3.9[95037]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:39:07 compute-0 systemd[1]: Reloading.
Oct 06 13:39:07 compute-0 systemd-rc-local-generator[95065]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:39:07 compute-0 systemd-sysv-generator[95069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:39:07 compute-0 sudo[95035]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:07 compute-0 sudo[95147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzhfqowkgotopyekisniwswollsjoyib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757945.9534962-1142-134899504527433/AnsiballZ_systemd.py'
Oct 06 13:39:07 compute-0 sudo[95147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:08 compute-0 python3.9[95149]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:39:08 compute-0 systemd[1]: Reloading.
Oct 06 13:39:08 compute-0 systemd-rc-local-generator[95176]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:39:08 compute-0 systemd-sysv-generator[95181]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:39:08 compute-0 systemd[1]: Starting ovn_controller container...
Oct 06 13:39:08 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 06 13:39:08 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:39:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a21701da4224e1a316c3cfcca2440e2719544c19747d7949528b9dbed3cfae6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 06 13:39:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b.
Oct 06 13:39:08 compute-0 podman[95190]: 2025-10-06 13:39:08.679393983 +0000 UTC m=+0.189506908 container init 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Oct 06 13:39:08 compute-0 ovn_controller[95205]: + sudo -E kolla_set_configs
Oct 06 13:39:08 compute-0 podman[95190]: 2025-10-06 13:39:08.717246911 +0000 UTC m=+0.227359766 container start 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 06 13:39:08 compute-0 edpm-start-podman-container[95190]: ovn_controller
Oct 06 13:39:08 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 06 13:39:08 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 06 13:39:08 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 06 13:39:08 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 06 13:39:08 compute-0 edpm-start-podman-container[95189]: Creating additional drop-in dependency for "ovn_controller" (20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b)
Oct 06 13:39:08 compute-0 podman[95212]: 2025-10-06 13:39:08.809752348 +0000 UTC m=+0.080319174 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:39:08 compute-0 systemd[95246]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 06 13:39:08 compute-0 systemd[1]: 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b-38c5c170ebd5d21d.service: Main process exited, code=exited, status=1/FAILURE
Oct 06 13:39:08 compute-0 systemd[1]: 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b-38c5c170ebd5d21d.service: Failed with result 'exit-code'.
Oct 06 13:39:08 compute-0 systemd[1]: Reloading.
Oct 06 13:39:08 compute-0 systemd-sysv-generator[95297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:39:08 compute-0 systemd-rc-local-generator[95292]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:39:08 compute-0 systemd[95246]: Queued start job for default target Main User Target.
Oct 06 13:39:08 compute-0 systemd[95246]: Created slice User Application Slice.
Oct 06 13:39:08 compute-0 systemd[95246]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 06 13:39:08 compute-0 systemd[95246]: Started Daily Cleanup of User's Temporary Directories.
Oct 06 13:39:08 compute-0 systemd[95246]: Reached target Paths.
Oct 06 13:39:08 compute-0 systemd[95246]: Reached target Timers.
Oct 06 13:39:08 compute-0 systemd[95246]: Starting D-Bus User Message Bus Socket...
Oct 06 13:39:08 compute-0 systemd[95246]: Starting Create User's Volatile Files and Directories...
Oct 06 13:39:08 compute-0 systemd[95246]: Listening on D-Bus User Message Bus Socket.
Oct 06 13:39:08 compute-0 systemd[95246]: Reached target Sockets.
Oct 06 13:39:08 compute-0 systemd[95246]: Finished Create User's Volatile Files and Directories.
Oct 06 13:39:08 compute-0 systemd[95246]: Reached target Basic System.
Oct 06 13:39:08 compute-0 systemd[95246]: Reached target Main User Target.
Oct 06 13:39:08 compute-0 systemd[95246]: Startup finished in 155ms.
Oct 06 13:39:09 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 06 13:39:09 compute-0 systemd[1]: Started ovn_controller container.
Oct 06 13:39:09 compute-0 systemd[1]: Started Session c1 of User root.
Oct 06 13:39:09 compute-0 sudo[95147]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:09 compute-0 ovn_controller[95205]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:39:09 compute-0 ovn_controller[95205]: INFO:__main__:Validating config file
Oct 06 13:39:09 compute-0 ovn_controller[95205]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:39:09 compute-0 ovn_controller[95205]: INFO:__main__:Writing out command to execute
Oct 06 13:39:09 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 06 13:39:09 compute-0 ovn_controller[95205]: ++ cat /run_command
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + ARGS=
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + sudo kolla_copy_cacerts
Oct 06 13:39:09 compute-0 systemd[1]: Started Session c2 of User root.
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + [[ ! -n '' ]]
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + . kolla_extend_start
Oct 06 13:39:09 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 06 13:39:09 compute-0 ovn_controller[95205]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + umask 0022
Oct 06 13:39:09 compute-0 ovn_controller[95205]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 06 13:39:09 compute-0 ovn_controller[95205]: 2025-10-06T13:39:09Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 06 13:39:09 compute-0 NetworkManager[52035]: <info>  [1759757949.3090] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 06 13:39:09 compute-0 NetworkManager[52035]: <info>  [1759757949.3102] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 06 13:39:09 compute-0 NetworkManager[52035]: <info>  [1759757949.3117] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct 06 13:39:09 compute-0 NetworkManager[52035]: <info>  [1759757949.3125] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct 06 13:39:09 compute-0 NetworkManager[52035]: <info>  [1759757949.3131] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 06 13:39:09 compute-0 kernel: br-int: entered promiscuous mode
Oct 06 13:39:09 compute-0 systemd-udevd[95362]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:39:09 compute-0 sudo[95467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhepbkgpkbtbqlyxkexihelruwwighu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757949.3428097-1198-122540445592569/AnsiballZ_command.py'
Oct 06 13:39:09 compute-0 sudo[95467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:09 compute-0 python3.9[95469]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:39:09 compute-0 ovs-vsctl[95470]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 06 13:39:10 compute-0 sudo[95467]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00025|main|INFO|OVS feature set changed, force recompute.
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00029|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00030|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00032|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00033|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00034|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00035|features|INFO|OVS Feature: meter_support, state: supported
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00036|features|INFO|OVS Feature: group_support, state: supported
Oct 06 13:39:10 compute-0 ovn_controller[95205]: 2025-10-06T13:39:10Z|00037|main|INFO|OVS feature set changed, force recompute.
Oct 06 13:39:10 compute-0 NetworkManager[52035]: <info>  [1759757950.3309] manager: (ovn-a988b0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 06 13:39:10 compute-0 NetworkManager[52035]: <info>  [1759757950.3322] manager: (ovn-7f5c9d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Oct 06 13:39:10 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct 06 13:39:10 compute-0 NetworkManager[52035]: <info>  [1759757950.3623] device (genev_sys_6081): carrier: link connected
Oct 06 13:39:10 compute-0 NetworkManager[52035]: <info>  [1759757950.3631] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Oct 06 13:39:10 compute-0 sudo[95623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atmdqskjhpgrdzibestbqzstejsroqgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757950.2385664-1214-163866212166908/AnsiballZ_command.py'
Oct 06 13:39:10 compute-0 sudo[95623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:10 compute-0 python3.9[95625]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:39:10 compute-0 ovs-vsctl[95627]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 06 13:39:10 compute-0 sudo[95623]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:11 compute-0 sudo[95778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrxfxoiwhijypjofotlxfnwmjcurfelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757951.3091464-1242-75916632312131/AnsiballZ_command.py'
Oct 06 13:39:11 compute-0 sudo[95778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:11 compute-0 python3.9[95780]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:39:11 compute-0 ovs-vsctl[95781]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 06 13:39:11 compute-0 sudo[95778]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:12 compute-0 sshd-session[84709]: Connection closed by 192.168.122.30 port 42056
Oct 06 13:39:12 compute-0 sshd-session[84706]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:39:12 compute-0 systemd-logind[789]: Session 21 logged out. Waiting for processes to exit.
Oct 06 13:39:12 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Oct 06 13:39:12 compute-0 systemd[1]: session-21.scope: Consumed 53.749s CPU time.
Oct 06 13:39:12 compute-0 systemd-logind[789]: Removed session 21.
Oct 06 13:39:17 compute-0 sshd-session[95806]: Accepted publickey for zuul from 192.168.122.30 port 60774 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:39:17 compute-0 systemd-logind[789]: New session 23 of user zuul.
Oct 06 13:39:17 compute-0 systemd[1]: Started Session 23 of User zuul.
Oct 06 13:39:17 compute-0 sshd-session[95806]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:39:18 compute-0 python3.9[95959]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:39:19 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 06 13:39:19 compute-0 systemd[95246]: Activating special unit Exit the Session...
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped target Main User Target.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped target Basic System.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped target Paths.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped target Sockets.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped target Timers.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 06 13:39:19 compute-0 systemd[95246]: Closed D-Bus User Message Bus Socket.
Oct 06 13:39:19 compute-0 systemd[95246]: Stopped Create User's Volatile Files and Directories.
Oct 06 13:39:19 compute-0 systemd[95246]: Removed slice User Application Slice.
Oct 06 13:39:19 compute-0 systemd[95246]: Reached target Shutdown.
Oct 06 13:39:19 compute-0 systemd[95246]: Finished Exit the Session.
Oct 06 13:39:19 compute-0 systemd[95246]: Reached target Exit the Session.
Oct 06 13:39:19 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 06 13:39:19 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 06 13:39:19 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 06 13:39:19 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 06 13:39:19 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 06 13:39:19 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 06 13:39:19 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 06 13:39:19 compute-0 sudo[96116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgcezgnkobkarhaxgfyadvcyfeltxjch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757959.5102055-48-153899633873314/AnsiballZ_file.py'
Oct 06 13:39:19 compute-0 sudo[96116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:20 compute-0 python3.9[96118]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:20 compute-0 sudo[96116]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:20 compute-0 sudo[96268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkcqebzszopxyjapxggbrggipqcqydxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757960.3580613-48-50368708333497/AnsiballZ_file.py'
Oct 06 13:39:20 compute-0 sudo[96268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:20 compute-0 python3.9[96270]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:20 compute-0 sudo[96268]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:21 compute-0 sudo[96420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udlhvkojjuhotiualzokspcwmqrxiurd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757961.0792277-48-61256253314828/AnsiballZ_file.py'
Oct 06 13:39:21 compute-0 sudo[96420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:21 compute-0 python3.9[96422]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:21 compute-0 sudo[96420]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:22 compute-0 sudo[96572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmjxhrvsbgyexcdjycatbjwliardhpmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757961.7729647-48-223121241864039/AnsiballZ_file.py'
Oct 06 13:39:22 compute-0 sudo[96572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:22 compute-0 python3.9[96574]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:22 compute-0 sudo[96572]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:22 compute-0 sudo[96724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siajuiblzhzpvulmfitjafckjodaiwnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757962.4553041-48-91141064880072/AnsiballZ_file.py'
Oct 06 13:39:22 compute-0 sudo[96724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:23 compute-0 python3.9[96726]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:23 compute-0 sudo[96724]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:23 compute-0 python3.9[96876]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:39:24 compute-0 sudo[97026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxfgjtbsareyduopnxtghqdjweonotxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757964.0181062-136-71818331786314/AnsiballZ_seboolean.py'
Oct 06 13:39:24 compute-0 sudo[97026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:24 compute-0 python3.9[97028]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 06 13:39:25 compute-0 sudo[97026]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:26 compute-0 python3.9[97178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:26 compute-0 python3.9[97299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757965.4457371-152-252230464655970/.source follow=False _original_basename=haproxy.j2 checksum=c79297b85e162ecfae922e6224a3e67813774089 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:27 compute-0 python3.9[97449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:28 compute-0 python3.9[97570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757967.0115347-182-58001011422600/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:28 compute-0 sudo[97721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqqteawmangmifsiykuzzxvlbzpkcgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757968.430556-216-267257228429697/AnsiballZ_setup.py'
Oct 06 13:39:28 compute-0 sudo[97721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:28 compute-0 python3.9[97723]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:39:29 compute-0 sudo[97721]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:29 compute-0 sudo[97805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjwjwiumfuqhsfijyvbbngdkwgxwkig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757968.430556-216-267257228429697/AnsiballZ_dnf.py'
Oct 06 13:39:29 compute-0 sudo[97805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:29 compute-0 python3.9[97807]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:39:30 compute-0 sudo[97805]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:31 compute-0 sudo[97958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xifuisycqgirdabozavyseamcedmymud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757971.1775463-240-249766731138500/AnsiballZ_systemd.py'
Oct 06 13:39:31 compute-0 sudo[97958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:32 compute-0 python3.9[97960]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:39:33 compute-0 sudo[97958]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:33 compute-0 python3.9[98113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:34 compute-0 python3.9[98234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757973.4060447-256-12720410956605/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:35 compute-0 python3.9[98384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:35 compute-0 python3.9[98505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757974.7977657-256-13885506030857/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:37 compute-0 python3.9[98655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:37 compute-0 python3.9[98776]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757976.6785653-344-30041019424801/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:38 compute-0 python3.9[98926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:39 compute-0 python3.9[99047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757977.9236562-344-169208350083308/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:39 compute-0 ovn_controller[95205]: 2025-10-06T13:39:39Z|00038|memory|INFO|17612 kB peak resident set size after 29.9 seconds
Oct 06 13:39:39 compute-0 ovn_controller[95205]: 2025-10-06T13:39:39Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Oct 06 13:39:39 compute-0 podman[99048]: 2025-10-06 13:39:39.203786227 +0000 UTC m=+0.143837373 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 06 13:39:39 compute-0 python3.9[99223]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:39:40 compute-0 sudo[99375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enjeokkcojquyimoxbrpgthivhomihoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757980.086189-420-242270490843133/AnsiballZ_file.py'
Oct 06 13:39:40 compute-0 sudo[99375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:40 compute-0 python3.9[99377]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:40 compute-0 sudo[99375]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:41 compute-0 sudo[99527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suaeyqcbxeagduagaxnltyflvaddenkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757980.9192138-436-274486033020736/AnsiballZ_stat.py'
Oct 06 13:39:41 compute-0 sudo[99527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:41 compute-0 python3.9[99529]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:41 compute-0 sudo[99527]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:41 compute-0 sudo[99605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbubqhmmwovqawqnaqiprgzswrwfhaeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757980.9192138-436-274486033020736/AnsiballZ_file.py'
Oct 06 13:39:41 compute-0 sudo[99605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:41 compute-0 python3.9[99607]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:41 compute-0 sudo[99605]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:42 compute-0 sudo[99757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrktahqfwlrofzhubzykusoopbdjbzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757982.1594567-436-279530899900254/AnsiballZ_stat.py'
Oct 06 13:39:42 compute-0 sudo[99757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:42 compute-0 python3.9[99759]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:42 compute-0 sudo[99757]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:43 compute-0 sudo[99835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyajrhccatiyrvjofocspjmzsibwfobg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757982.1594567-436-279530899900254/AnsiballZ_file.py'
Oct 06 13:39:43 compute-0 sudo[99835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:43 compute-0 python3.9[99837]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:43 compute-0 sudo[99835]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:43 compute-0 sudo[99987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iazbrviqyjgywuuffkpjqouhoxdrmrht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757983.464224-482-277022135793164/AnsiballZ_file.py'
Oct 06 13:39:43 compute-0 sudo[99987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:44 compute-0 python3.9[99989]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:44 compute-0 sudo[99987]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:44 compute-0 sudo[100139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixzvdcpvilbyabnzckjitpsupzrdmbif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757984.228399-498-279036415626103/AnsiballZ_stat.py'
Oct 06 13:39:44 compute-0 sudo[100139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:44 compute-0 python3.9[100141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:44 compute-0 sudo[100139]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:45 compute-0 sudo[100217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyszzlrwismgkixbuyqpcymnrylhahks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757984.228399-498-279036415626103/AnsiballZ_file.py'
Oct 06 13:39:45 compute-0 sudo[100217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:45 compute-0 python3.9[100219]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:45 compute-0 sudo[100217]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:45 compute-0 sudo[100369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpbsuaqjrnhyylzjhrknfvhiensonkrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757985.530059-522-64036590027484/AnsiballZ_stat.py'
Oct 06 13:39:45 compute-0 sudo[100369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:46 compute-0 python3.9[100371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:46 compute-0 sudo[100369]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:46 compute-0 sudo[100447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfouowdbxarvzxajaeqszpxjbdbxruez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757985.530059-522-64036590027484/AnsiballZ_file.py'
Oct 06 13:39:46 compute-0 sudo[100447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:46 compute-0 python3.9[100449]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:46 compute-0 sudo[100447]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:47 compute-0 sudo[100599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykfsvygobtyeowebziqnzxrofujalmsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757986.7958705-546-218729814549171/AnsiballZ_systemd.py'
Oct 06 13:39:47 compute-0 sudo[100599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:47 compute-0 python3.9[100601]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:39:47 compute-0 systemd[1]: Reloading.
Oct 06 13:39:47 compute-0 systemd-sysv-generator[100632]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:39:47 compute-0 systemd-rc-local-generator[100626]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:39:47 compute-0 sudo[100599]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:48 compute-0 sudo[100788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzkrjimlvlhxdfexhfldgtontalyfsrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757988.1452928-562-76988061325232/AnsiballZ_stat.py'
Oct 06 13:39:48 compute-0 sudo[100788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:48 compute-0 python3.9[100790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:48 compute-0 sudo[100788]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:49 compute-0 sudo[100866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkqxeazgdjokjbcnryyjqykmpquewyhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757988.1452928-562-76988061325232/AnsiballZ_file.py'
Oct 06 13:39:49 compute-0 sudo[100866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:49 compute-0 python3.9[100868]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:49 compute-0 sudo[100866]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:49 compute-0 sudo[101018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdljxlhvqcsbmzneqlcdknkdpqolmrkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757989.4885123-586-234037562732189/AnsiballZ_stat.py'
Oct 06 13:39:49 compute-0 sudo[101018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:50 compute-0 python3.9[101020]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:50 compute-0 sudo[101018]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:50 compute-0 sudo[101096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pektoaziickymipwjozdkpgjkrajrver ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757989.4885123-586-234037562732189/AnsiballZ_file.py'
Oct 06 13:39:50 compute-0 sudo[101096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:50 compute-0 python3.9[101098]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:50 compute-0 sudo[101096]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:51 compute-0 sudo[101248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovgqtkrgvefnxuxtruhcxkqcwnfaapdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757990.7009993-610-14715077689208/AnsiballZ_systemd.py'
Oct 06 13:39:51 compute-0 sudo[101248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:51 compute-0 python3.9[101250]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:39:51 compute-0 systemd[1]: Reloading.
Oct 06 13:39:51 compute-0 systemd-rc-local-generator[101277]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:39:51 compute-0 systemd-sysv-generator[101281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:39:51 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:39:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:39:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:39:51 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:39:51 compute-0 sudo[101248]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:52 compute-0 sudo[101441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkulhiocafhihoiumjdcnlioqbxgeuma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757991.9385362-630-12293367184749/AnsiballZ_file.py'
Oct 06 13:39:52 compute-0 sudo[101441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:52 compute-0 python3.9[101443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:52 compute-0 sudo[101441]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:53 compute-0 sudo[101593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwpsejxrgazslxlkpjfxgnrogixpmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757992.7355056-646-192293743134584/AnsiballZ_stat.py'
Oct 06 13:39:53 compute-0 sudo[101593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:53 compute-0 python3.9[101595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:53 compute-0 sudo[101593]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:53 compute-0 sudo[101716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktujtvqgsbapaijnjsjevxrrlkxwptor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757992.7355056-646-192293743134584/AnsiballZ_copy.py'
Oct 06 13:39:53 compute-0 sudo[101716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:53 compute-0 python3.9[101718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759757992.7355056-646-192293743134584/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:53 compute-0 sudo[101716]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:54 compute-0 sudo[101868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scysbnlqnaapjjeixihdlcsrwhjwvzck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757994.269969-680-176039870769777/AnsiballZ_file.py'
Oct 06 13:39:54 compute-0 sudo[101868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:54 compute-0 python3.9[101870]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:39:54 compute-0 sudo[101868]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:55 compute-0 sudo[102020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsxkydwzrmonszbwypmcqiiddtkcncem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757995.0343294-696-25770723481817/AnsiballZ_stat.py'
Oct 06 13:39:55 compute-0 sudo[102020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:55 compute-0 python3.9[102022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:39:55 compute-0 sudo[102020]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:55 compute-0 sudo[102143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byspkthnrxjhrhmjxtmisxtkvzjeifvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757995.0343294-696-25770723481817/AnsiballZ_copy.py'
Oct 06 13:39:55 compute-0 sudo[102143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:56 compute-0 python3.9[102145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759757995.0343294-696-25770723481817/.source.json _original_basename=.r4z4y_w9 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:56 compute-0 sudo[102143]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:56 compute-0 sudo[102295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blmtjrxvesgfedncdbcmpdajlvhsxuix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757996.2967525-726-157048039282980/AnsiballZ_file.py'
Oct 06 13:39:56 compute-0 sudo[102295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:56 compute-0 python3.9[102297]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:39:56 compute-0 sudo[102295]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:57 compute-0 sudo[102447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaqtjjcrtfbgrifgldbjzqfzitkrasqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757997.1211157-742-128993920364227/AnsiballZ_stat.py'
Oct 06 13:39:57 compute-0 sudo[102447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:57 compute-0 sudo[102447]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:58 compute-0 sudo[102570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgifblkxotxmzcilkxneharaumtdfipy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757997.1211157-742-128993920364227/AnsiballZ_copy.py'
Oct 06 13:39:58 compute-0 sudo[102570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:58 compute-0 sudo[102570]: pam_unix(sudo:session): session closed for user root
Oct 06 13:39:59 compute-0 sudo[102722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wokstzvrabymfcirckjehxshwjsmfalu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757998.758018-776-216323835632718/AnsiballZ_container_config_data.py'
Oct 06 13:39:59 compute-0 sudo[102722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:39:59 compute-0 python3.9[102724]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 06 13:39:59 compute-0 sudo[102722]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:00 compute-0 sudo[102874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpfbdhdflhvhvjgqnztheekywhnpkdja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759757999.7386334-794-196277719400787/AnsiballZ_container_config_hash.py'
Oct 06 13:40:00 compute-0 sudo[102874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:00 compute-0 python3.9[102876]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:40:00 compute-0 sudo[102874]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:01 compute-0 sudo[103026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwaqgtaoogmmhzhbbyekifxpcxxrpdna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758000.7190568-812-260120178270944/AnsiballZ_podman_container_info.py'
Oct 06 13:40:01 compute-0 sudo[103026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:01 compute-0 python3.9[103028]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 06 13:40:01 compute-0 sudo[103026]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:02 compute-0 sudo[103204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gybevtwovwacgrygamkijwuydphqxsfk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758002.1051798-838-75413002188871/AnsiballZ_edpm_container_manage.py'
Oct 06 13:40:02 compute-0 sudo[103204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:02 compute-0 python3[103206]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:40:03 compute-0 podman[103243]: 2025-10-06 13:40:03.143886485 +0000 UTC m=+0.066303381 container create d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 06 13:40:03 compute-0 podman[103243]: 2025-10-06 13:40:03.106520837 +0000 UTC m=+0.028937823 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 13:40:03 compute-0 python3[103206]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 13:40:03 compute-0 sudo[103204]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:03 compute-0 sudo[103430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgikqvlqujwybomqakohcjncecewvadq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758003.5300953-854-113201111049010/AnsiballZ_stat.py'
Oct 06 13:40:03 compute-0 sudo[103430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:04 compute-0 python3.9[103432]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:40:04 compute-0 sudo[103430]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:04 compute-0 sudo[103584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fggndxpujznakleowwqkzzaqqawkzzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758004.3587646-872-96241799205532/AnsiballZ_file.py'
Oct 06 13:40:04 compute-0 sudo[103584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:04 compute-0 python3.9[103586]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:04 compute-0 sudo[103584]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:05 compute-0 sudo[103660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vocjgchwffpegzwcgopbjyvvwgcpoanr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758004.3587646-872-96241799205532/AnsiballZ_stat.py'
Oct 06 13:40:05 compute-0 sudo[103660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:05 compute-0 python3.9[103662]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:40:05 compute-0 sudo[103660]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:05 compute-0 sudo[103811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkjwluadswpnzbxdebuhyegevnkvjzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758005.4496684-872-146067669752931/AnsiballZ_copy.py'
Oct 06 13:40:05 compute-0 sudo[103811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:06 compute-0 python3.9[103813]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758005.4496684-872-146067669752931/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:06 compute-0 sudo[103811]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:06 compute-0 sudo[103887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiyayxtgycjhvsymttfufjcnokmcxbbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758005.4496684-872-146067669752931/AnsiballZ_systemd.py'
Oct 06 13:40:06 compute-0 sudo[103887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:06 compute-0 python3.9[103889]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:40:06 compute-0 systemd[1]: Reloading.
Oct 06 13:40:06 compute-0 systemd-rc-local-generator[103917]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:40:06 compute-0 systemd-sysv-generator[103920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:40:07 compute-0 sudo[103887]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:07 compute-0 sudo[103998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgkfuyrxumtgimohxhldzclfpyzjsbqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758005.4496684-872-146067669752931/AnsiballZ_systemd.py'
Oct 06 13:40:07 compute-0 sudo[103998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:07 compute-0 python3.9[104000]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:08 compute-0 systemd[1]: Reloading.
Oct 06 13:40:08 compute-0 systemd-rc-local-generator[104029]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:40:08 compute-0 systemd-sysv-generator[104035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:40:09 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct 06 13:40:09 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:40:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac0da805a1e5fa71497b28e79d8341a2402ca9851446e1c2f6f8d6cad3a68b02/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 06 13:40:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac0da805a1e5fa71497b28e79d8341a2402ca9851446e1c2f6f8d6cad3a68b02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 13:40:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9.
Oct 06 13:40:09 compute-0 podman[104041]: 2025-10-06 13:40:09.328705486 +0000 UTC m=+0.296290907 container init d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + sudo -E kolla_set_configs
Oct 06 13:40:09 compute-0 podman[104041]: 2025-10-06 13:40:09.36709341 +0000 UTC m=+0.334678811 container start d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 13:40:09 compute-0 edpm-start-podman-container[104041]: ovn_metadata_agent
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Validating config file
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Copying service configuration files
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Writing out command to execute
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: ++ cat /run_command
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + CMD=neutron-ovn-metadata-agent
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + ARGS=
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + sudo kolla_copy_cacerts
Oct 06 13:40:09 compute-0 edpm-start-podman-container[104040]: Creating additional drop-in dependency for "ovn_metadata_agent" (d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9)
Oct 06 13:40:09 compute-0 podman[104074]: 2025-10-06 13:40:09.453069904 +0000 UTC m=+0.075544107 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: Running command: 'neutron-ovn-metadata-agent'
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + [[ ! -n '' ]]
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + . kolla_extend_start
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + umask 0022
Oct 06 13:40:09 compute-0 ovn_metadata_agent[104057]: + exec neutron-ovn-metadata-agent
Oct 06 13:40:09 compute-0 podman[104060]: 2025-10-06 13:40:09.46695308 +0000 UTC m=+0.146383924 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 06 13:40:09 compute-0 systemd[1]: Reloading.
Oct 06 13:40:09 compute-0 systemd-sysv-generator[104166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:40:09 compute-0 systemd-rc-local-generator[104163]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:40:09 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct 06 13:40:09 compute-0 sudo[103998]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:10 compute-0 sshd-session[95809]: Connection closed by 192.168.122.30 port 60774
Oct 06 13:40:10 compute-0 sshd-session[95806]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:40:10 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Oct 06 13:40:10 compute-0 systemd[1]: session-23.scope: Consumed 40.363s CPU time.
Oct 06 13:40:10 compute-0 systemd-logind[789]: Session 23 logged out. Waiting for processes to exit.
Oct 06 13:40:10 compute-0 systemd-logind[789]: Removed session 23.
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.252 104072 INFO neutron.common.config [-] Logging enabled!
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.253 104072 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.253 104072 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.254 104072 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.255 104072 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.256 104072 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.257 104072 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.258 104072 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.259 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.260 104072 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.261 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.150 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.262 104072 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.263 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.264 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.265 104072 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.266 104072 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.267 104072 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.268 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.269 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.270 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.271 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.272 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.273 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.274 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.275 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.276 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.277 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.278 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.279 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.280 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.280 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.280 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.280 104072 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.281 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.282 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.283 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.283 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.283 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.283 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.283 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.284 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.285 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.286 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.287 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.288 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.289 104072 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.289 104072 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.289 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.289 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.289 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.290 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.290 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.290 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.290 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.290 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.291 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.291 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.291 104072 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.291 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.292 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.293 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.294 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.295 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.296 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.297 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.298 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.299 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.300 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.301 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.302 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.303 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.304 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.305 104072 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.317 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.317 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.318 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.318 104072 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.318 104072 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.334 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 6cb79b8b-7bef-432f-9e10-9690a1ce5aa4 (UUID: 6cb79b8b-7bef-432f-9e10-9690a1ce5aa4) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.358 104072 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.358 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.358 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.358 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.358 104072 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.362 104072 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.366 104072 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.374 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '6cb79b8b-7bef-432f-9e10-9690a1ce5aa4'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], external_ids={}, name=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, nb_cfg_timestamp=1759757958326, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.377 104072 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpxoqrdymv/privsep.sock']
Oct 06 13:40:11 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.140 104072 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.140 104072 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpxoqrdymv/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.968 104207 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.974 104207 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.977 104207 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:11.977 104207 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104207
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.142 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[73747046-099c-418d-bb7f-e9ecb6bab093]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.612 104207 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.612 104207 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:40:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:12.612 104207 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.057 104207 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.062 104207 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.098 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[f3948a56-af22-49a0-aedc-491db36e19b2]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.100 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, column=external_ids, values=({'neutron:ovn-metadata-id': '7a210ef4-3690-5ae3-9c96-3626080bacf4'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.121 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:40:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:40:13.137 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:40:15 compute-0 sshd-session[104212]: Accepted publickey for zuul from 192.168.122.30 port 34596 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:40:15 compute-0 systemd-logind[789]: New session 24 of user zuul.
Oct 06 13:40:15 compute-0 systemd[1]: Started Session 24 of User zuul.
Oct 06 13:40:15 compute-0 sshd-session[104212]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:40:16 compute-0 python3.9[104365]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:40:17 compute-0 sudo[104519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cghlykbnngneagcgsiroiajagdutgshd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758016.9893904-48-6477476993986/AnsiballZ_command.py'
Oct 06 13:40:17 compute-0 sudo[104519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:17 compute-0 python3.9[104521]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:17 compute-0 sudo[104519]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:18 compute-0 sudo[104684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wndermtmvytolmwpveeslnhvfngkywwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758018.023663-70-1639805120500/AnsiballZ_systemd_service.py'
Oct 06 13:40:18 compute-0 sudo[104684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:18 compute-0 python3.9[104686]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:40:18 compute-0 systemd[1]: Reloading.
Oct 06 13:40:18 compute-0 systemd-rc-local-generator[104709]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:40:18 compute-0 systemd-sysv-generator[104715]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:40:19 compute-0 sudo[104684]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:20 compute-0 python3.9[104871]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:40:20 compute-0 network[104888]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:40:20 compute-0 network[104889]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:40:20 compute-0 network[104890]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:40:24 compute-0 sudo[105152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bblfdjdhcjzytarxpmkinikxlndcbfwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758024.496134-108-22955922588337/AnsiballZ_systemd_service.py'
Oct 06 13:40:24 compute-0 sudo[105152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:25 compute-0 python3.9[105154]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:25 compute-0 sudo[105152]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:25 compute-0 sudo[105305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mndhrieefrtgmvdhvtjzlppjszflhtff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758025.3604975-108-192439866010896/AnsiballZ_systemd_service.py'
Oct 06 13:40:25 compute-0 sudo[105305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:25 compute-0 python3.9[105307]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:25 compute-0 sudo[105305]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:26 compute-0 sudo[105458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-natrwxjugoqogbjdslznewzjusfehywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758026.1056528-108-56477033121280/AnsiballZ_systemd_service.py'
Oct 06 13:40:26 compute-0 sudo[105458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:26 compute-0 python3.9[105460]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:26 compute-0 sudo[105458]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:27 compute-0 sudo[105611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwnqbtlicmybakkytkprpqhvjsufljwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758026.9791193-108-190179193235622/AnsiballZ_systemd_service.py'
Oct 06 13:40:27 compute-0 sudo[105611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:27 compute-0 python3.9[105613]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:27 compute-0 sudo[105611]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:28 compute-0 sudo[105764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkfcezthklqwfmwihhgwflfkziilzwtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758027.9240775-108-191320241201792/AnsiballZ_systemd_service.py'
Oct 06 13:40:28 compute-0 sudo[105764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:28 compute-0 python3.9[105766]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:28 compute-0 sudo[105764]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:29 compute-0 sudo[105917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqgubxzohioruhejqatmduihdboufydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758028.7813876-108-126612328106363/AnsiballZ_systemd_service.py'
Oct 06 13:40:29 compute-0 sudo[105917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:29 compute-0 python3.9[105919]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:29 compute-0 sudo[105917]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:30 compute-0 sudo[106070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjqfbgrdthhqfrlqhrfugogrfjzrzexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758029.6573985-108-29655554774086/AnsiballZ_systemd_service.py'
Oct 06 13:40:30 compute-0 sudo[106070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:30 compute-0 python3.9[106072]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:40:30 compute-0 sudo[106070]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:31 compute-0 sudo[106223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhmrwxrmlopruudhcjvxxjczllshdsjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758030.686014-212-39422693652965/AnsiballZ_file.py'
Oct 06 13:40:31 compute-0 sudo[106223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:31 compute-0 python3.9[106225]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:31 compute-0 sudo[106223]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:31 compute-0 sudo[106375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyyhtmnyeritniiqtjhwxcdwlkvhsxer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758031.5477512-212-56309981393911/AnsiballZ_file.py'
Oct 06 13:40:31 compute-0 sudo[106375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:32 compute-0 python3.9[106377]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:32 compute-0 sudo[106375]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:32 compute-0 sudo[106527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkgcwgmleugonefzcbdvbjvuoygbpvvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758032.1852083-212-6069328401191/AnsiballZ_file.py'
Oct 06 13:40:32 compute-0 sudo[106527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:32 compute-0 python3.9[106529]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:32 compute-0 sudo[106527]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:33 compute-0 sudo[106679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqssgkzsfbqxwmuuczoialffoojqofft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758032.9019427-212-214559605902294/AnsiballZ_file.py'
Oct 06 13:40:33 compute-0 sudo[106679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:33 compute-0 python3.9[106681]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:33 compute-0 sudo[106679]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:33 compute-0 sudo[106831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtqeyikieocktiimloneywmonmbyrjms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758033.6412458-212-183668614271714/AnsiballZ_file.py'
Oct 06 13:40:33 compute-0 sudo[106831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:34 compute-0 python3.9[106833]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:34 compute-0 sudo[106831]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:34 compute-0 sudo[106983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utxsarlatdyfugbntsedwkgbrvsftxnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758034.3829768-212-234297985069579/AnsiballZ_file.py'
Oct 06 13:40:34 compute-0 sudo[106983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:34 compute-0 python3.9[106985]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:34 compute-0 sudo[106983]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:35 compute-0 sudo[107135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghnzjsdwkfopwadabcvmkxyngebddzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758035.1496243-212-243224568371497/AnsiballZ_file.py'
Oct 06 13:40:35 compute-0 sudo[107135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:35 compute-0 python3.9[107137]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:35 compute-0 sudo[107135]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:36 compute-0 sudo[107287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eherraevctlznklqfgvjopzbaptjbjew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758035.9093676-312-59144103692811/AnsiballZ_file.py'
Oct 06 13:40:36 compute-0 sudo[107287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:36 compute-0 python3.9[107289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:36 compute-0 sudo[107287]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:37 compute-0 sudo[107439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlisjnhisuixpragvjpzglwlnfyyegar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758036.706767-312-186311820012470/AnsiballZ_file.py'
Oct 06 13:40:37 compute-0 sudo[107439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:37 compute-0 python3.9[107441]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:37 compute-0 sudo[107439]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:37 compute-0 sudo[107591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnyfnjvrgpdzbqegwhovrldexspjxewd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758037.4913626-312-33285514541326/AnsiballZ_file.py'
Oct 06 13:40:37 compute-0 sudo[107591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:37 compute-0 python3.9[107593]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:38 compute-0 sudo[107591]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:38 compute-0 sudo[107743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccndaagxvcsiyizecplgtjcnuwzydqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758038.157949-312-70339582626990/AnsiballZ_file.py'
Oct 06 13:40:38 compute-0 sudo[107743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:38 compute-0 python3.9[107745]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:38 compute-0 sudo[107743]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:39 compute-0 sudo[107895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdxblhwynmdpscnysbphdwvyyogrztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758038.8351684-312-54968403714799/AnsiballZ_file.py'
Oct 06 13:40:39 compute-0 sudo[107895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:39 compute-0 python3.9[107897]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:39 compute-0 sudo[107895]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:39 compute-0 sudo[108073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwlfamvyozunzfipyyprlmvikhcynxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758039.4778128-312-147286536470188/AnsiballZ_file.py'
Oct 06 13:40:39 compute-0 sudo[108073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:39 compute-0 podman[108022]: 2025-10-06 13:40:39.862257919 +0000 UTC m=+0.083256715 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 13:40:39 compute-0 podman[108021]: 2025-10-06 13:40:39.918777798 +0000 UTC m=+0.140606476 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:40:40 compute-0 python3.9[108087]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:40 compute-0 sudo[108073]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:40 compute-0 sudo[108245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifdfzkjfhvawppuhxunojvriwnrsmzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758040.218661-312-184827699840106/AnsiballZ_file.py'
Oct 06 13:40:40 compute-0 sudo[108245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:40 compute-0 python3.9[108247]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:40:40 compute-0 sudo[108245]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:41 compute-0 sudo[108397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngawsbtijhkwlekqyicprtiolkjkvbcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758041.100487-414-193815467010746/AnsiballZ_command.py'
Oct 06 13:40:41 compute-0 sudo[108397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:41 compute-0 python3.9[108399]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:41 compute-0 sudo[108397]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:42 compute-0 python3.9[108551]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:40:43 compute-0 sudo[108701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qplobybamgnqzhuwsrqttwtpqgvwialw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758042.9105067-450-1706389801029/AnsiballZ_systemd_service.py'
Oct 06 13:40:43 compute-0 sudo[108701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:43 compute-0 python3.9[108703]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:40:43 compute-0 systemd[1]: Reloading.
Oct 06 13:40:43 compute-0 systemd-rc-local-generator[108731]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:40:43 compute-0 systemd-sysv-generator[108735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:40:43 compute-0 sudo[108701]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:44 compute-0 sudo[108889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcntklieeggklvhnswuampfivnvxuffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758044.083614-466-174690864383951/AnsiballZ_command.py'
Oct 06 13:40:44 compute-0 sudo[108889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:44 compute-0 python3.9[108891]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:44 compute-0 sudo[108889]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:45 compute-0 sudo[109042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjfnkranrqtueckgvtjqloyhocqqmdqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758044.845645-466-6469529163459/AnsiballZ_command.py'
Oct 06 13:40:45 compute-0 sudo[109042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:45 compute-0 python3.9[109044]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:45 compute-0 sudo[109042]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:45 compute-0 sudo[109195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ankgnmmlnxucuemivlaxujpculhkhhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758045.6069236-466-221793072649023/AnsiballZ_command.py'
Oct 06 13:40:45 compute-0 sudo[109195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:46 compute-0 python3.9[109197]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:46 compute-0 sudo[109195]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:46 compute-0 sudo[109348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snmvnvjxsaovlqabqayagowufwgbnngt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758046.3208332-466-22675887350588/AnsiballZ_command.py'
Oct 06 13:40:46 compute-0 sudo[109348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:46 compute-0 python3.9[109350]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:46 compute-0 sudo[109348]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:47 compute-0 sudo[109501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maxerulikumsqbobkurmxbjcubjmdnkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758047.0577993-466-137030360753925/AnsiballZ_command.py'
Oct 06 13:40:47 compute-0 sudo[109501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:47 compute-0 python3.9[109503]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:47 compute-0 sudo[109501]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:48 compute-0 sudo[109654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-magjhcgegidwmpmflehghnzkzdakzgdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758047.8254197-466-191025666813265/AnsiballZ_command.py'
Oct 06 13:40:48 compute-0 sudo[109654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:48 compute-0 python3.9[109656]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:48 compute-0 sudo[109654]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:48 compute-0 sudo[109807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lejvpsgkhivlhvqquqmjayqjtgfokmwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758048.5702257-466-267607817993823/AnsiballZ_command.py'
Oct 06 13:40:48 compute-0 sudo[109807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:49 compute-0 python3.9[109809]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:40:49 compute-0 sudo[109807]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:50 compute-0 sudo[109960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buuaancajhywbdiqjdizkfouhulmbxaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758049.6541553-574-262453532240251/AnsiballZ_getent.py'
Oct 06 13:40:50 compute-0 sudo[109960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:50 compute-0 python3.9[109962]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 06 13:40:50 compute-0 sudo[109960]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:51 compute-0 sudo[110113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmrmaebgcjxjwfweiafxgvxwnoycnabi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758050.5793924-590-74159873325493/AnsiballZ_group.py'
Oct 06 13:40:51 compute-0 sudo[110113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:51 compute-0 python3.9[110115]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:40:51 compute-0 groupadd[110116]: group added to /etc/group: name=libvirt, GID=42473
Oct 06 13:40:51 compute-0 groupadd[110116]: group added to /etc/gshadow: name=libvirt
Oct 06 13:40:51 compute-0 groupadd[110116]: new group: name=libvirt, GID=42473
Oct 06 13:40:51 compute-0 sudo[110113]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:52 compute-0 sudo[110271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xubmfwhjzqirwdwckzubwthtczlufwtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758051.5955536-606-13316887588561/AnsiballZ_user.py'
Oct 06 13:40:52 compute-0 sudo[110271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:52 compute-0 python3.9[110273]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 06 13:40:52 compute-0 useradd[110275]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 06 13:40:52 compute-0 sudo[110271]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:53 compute-0 sudo[110431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtvnagtlrjtxqiqjtdylbpbiirehiatw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758052.8939009-628-25454117152740/AnsiballZ_setup.py'
Oct 06 13:40:53 compute-0 sudo[110431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:53 compute-0 python3.9[110433]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:40:53 compute-0 sudo[110431]: pam_unix(sudo:session): session closed for user root
Oct 06 13:40:54 compute-0 sudo[110515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koeomavqdiyqkqbtpvyktyndfvlnqlhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758052.8939009-628-25454117152740/AnsiballZ_dnf.py'
Oct 06 13:40:54 compute-0 sudo[110515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:40:54 compute-0 python3.9[110517]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:41:10 compute-0 podman[110703]: 2025-10-06 13:41:10.208152526 +0000 UTC m=+0.064436850 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 06 13:41:10 compute-0 podman[110702]: 2025-10-06 13:41:10.324802388 +0000 UTC m=+0.181548354 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 13:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:41:11.307 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:41:11.308 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:41:11.308 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:41:21 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:41:21 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:41:30 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:41:41 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 06 13:41:41 compute-0 podman[110770]: 2025-10-06 13:41:41.279939882 +0000 UTC m=+0.115757069 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 13:41:41 compute-0 podman[110769]: 2025-10-06 13:41:41.303186744 +0000 UTC m=+0.139099194 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 13:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:42:11.309 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:42:11.310 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:42:11.310 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:42:12 compute-0 podman[124669]: 2025-10-06 13:42:12.192823578 +0000 UTC m=+0.056838048 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 06 13:42:12 compute-0 podman[124659]: 2025-10-06 13:42:12.263076616 +0000 UTC m=+0.119862428 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 06 13:42:34 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 06 13:42:34 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 06 13:42:35 compute-0 groupadd[127618]: group added to /etc/group: name=dnsmasq, GID=992
Oct 06 13:42:35 compute-0 groupadd[127618]: group added to /etc/gshadow: name=dnsmasq
Oct 06 13:42:35 compute-0 groupadd[127618]: new group: name=dnsmasq, GID=992
Oct 06 13:42:35 compute-0 useradd[127625]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 06 13:42:35 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:42:35 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 06 13:42:35 compute-0 dbus-broker-launch[742]: Noticed file-system modification, trigger reload.
Oct 06 13:42:36 compute-0 groupadd[127638]: group added to /etc/group: name=clevis, GID=991
Oct 06 13:42:36 compute-0 groupadd[127638]: group added to /etc/gshadow: name=clevis
Oct 06 13:42:36 compute-0 groupadd[127638]: new group: name=clevis, GID=991
Oct 06 13:42:36 compute-0 useradd[127645]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 06 13:42:36 compute-0 usermod[127655]: add 'clevis' to group 'tss'
Oct 06 13:42:36 compute-0 usermod[127655]: add 'clevis' to shadow group 'tss'
Oct 06 13:42:38 compute-0 polkitd[6342]: Reloading rules
Oct 06 13:42:38 compute-0 polkitd[6342]: Collecting garbage unconditionally...
Oct 06 13:42:38 compute-0 polkitd[6342]: Loading rules from directory /etc/polkit-1/rules.d
Oct 06 13:42:38 compute-0 polkitd[6342]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 06 13:42:38 compute-0 polkitd[6342]: Finished loading, compiling and executing 4 rules
Oct 06 13:42:38 compute-0 polkitd[6342]: Reloading rules
Oct 06 13:42:38 compute-0 polkitd[6342]: Collecting garbage unconditionally...
Oct 06 13:42:38 compute-0 polkitd[6342]: Loading rules from directory /etc/polkit-1/rules.d
Oct 06 13:42:38 compute-0 polkitd[6342]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 06 13:42:38 compute-0 polkitd[6342]: Finished loading, compiling and executing 4 rules
Oct 06 13:42:40 compute-0 groupadd[127842]: group added to /etc/group: name=ceph, GID=167
Oct 06 13:42:40 compute-0 groupadd[127842]: group added to /etc/gshadow: name=ceph
Oct 06 13:42:40 compute-0 groupadd[127842]: new group: name=ceph, GID=167
Oct 06 13:42:40 compute-0 useradd[127848]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 06 13:42:42 compute-0 podman[127857]: 2025-10-06 13:42:42.400861488 +0000 UTC m=+0.084281161 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:42:42 compute-0 podman[127858]: 2025-10-06 13:42:42.451856245 +0000 UTC m=+0.134380114 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct 06 13:42:43 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct 06 13:42:43 compute-0 sshd[1003]: Received signal 15; terminating.
Oct 06 13:42:43 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct 06 13:42:43 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct 06 13:42:43 compute-0 systemd[1]: sshd.service: Consumed 6.416s CPU time, no IO.
Oct 06 13:42:43 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct 06 13:42:43 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct 06 13:42:43 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 13:42:43 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 13:42:43 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 06 13:42:43 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 06 13:42:43 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 06 13:42:43 compute-0 sshd[128410]: Server listening on 0.0.0.0 port 22.
Oct 06 13:42:43 compute-0 sshd[128410]: Server listening on :: port 22.
Oct 06 13:42:43 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 06 13:42:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:42:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:42:46 compute-0 systemd[1]: Reloading.
Oct 06 13:42:46 compute-0 systemd-rc-local-generator[128667]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:46 compute-0 systemd-sysv-generator[128670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:42:48 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 06 13:42:48 compute-0 PackageKit[130367]: daemon start
Oct 06 13:42:48 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 06 13:42:49 compute-0 sudo[110515]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:50 compute-0 sudo[131598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfrwkxycmvdracouktjwxdiknrxvjufj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758169.4059134-652-139237645686520/AnsiballZ_systemd.py'
Oct 06 13:42:50 compute-0 sudo[131598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:50 compute-0 python3.9[131628]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:42:50 compute-0 systemd[1]: Reloading.
Oct 06 13:42:50 compute-0 systemd-rc-local-generator[132013]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:50 compute-0 systemd-sysv-generator[132017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:50 compute-0 sudo[131598]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:51 compute-0 sudo[132723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymnaakuscowjmkzcsbqvwofmwapmkelv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758170.8686044-652-164014860531996/AnsiballZ_systemd.py'
Oct 06 13:42:51 compute-0 sudo[132723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:51 compute-0 python3.9[132753]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:42:51 compute-0 systemd[1]: Reloading.
Oct 06 13:42:51 compute-0 systemd-rc-local-generator[133075]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:51 compute-0 systemd-sysv-generator[133078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:51 compute-0 sudo[132723]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:52 compute-0 sudo[133773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzhqyblgqdhtgyldmadczcmcpuoomsbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758172.005318-652-89351777929086/AnsiballZ_systemd.py'
Oct 06 13:42:52 compute-0 sudo[133773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:52 compute-0 python3.9[133792]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:42:52 compute-0 systemd[1]: Reloading.
Oct 06 13:42:52 compute-0 systemd-rc-local-generator[134221]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:52 compute-0 systemd-sysv-generator[134228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:52 compute-0 sudo[133773]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:53 compute-0 sudo[134952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkbnjnvzxutbogdplmfkuoqpvaqvugey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758173.0595841-652-132951893511796/AnsiballZ_systemd.py'
Oct 06 13:42:53 compute-0 sudo[134952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:53 compute-0 python3.9[134977]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:42:53 compute-0 systemd[1]: Reloading.
Oct 06 13:42:53 compute-0 systemd-sysv-generator[135340]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:53 compute-0 systemd-rc-local-generator[135336]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:53 compute-0 sudo[134952]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:54 compute-0 sudo[136107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiyszzilyvszkfyokaisbfkxcaexefss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758174.1591399-710-37429746237993/AnsiballZ_systemd.py'
Oct 06 13:42:54 compute-0 sudo[136107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:54 compute-0 python3.9[136128]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:42:54 compute-0 systemd[1]: Reloading.
Oct 06 13:42:54 compute-0 systemd-sysv-generator[136699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:54 compute-0 systemd-rc-local-generator[136693]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:55 compute-0 sudo[136107]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:55 compute-0 sudo[137307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itlhnarwjpykmncuzraldfnydubgvmms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758175.2683516-710-159289906616138/AnsiballZ_systemd.py'
Oct 06 13:42:55 compute-0 sudo[137307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:55 compute-0 python3.9[137326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:42:55 compute-0 systemd[1]: Reloading.
Oct 06 13:42:56 compute-0 systemd-rc-local-generator[137693]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:56 compute-0 systemd-sysv-generator[137698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:56 compute-0 sudo[137307]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:56 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:42:56 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:42:56 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.017s CPU time.
Oct 06 13:42:56 compute-0 systemd[1]: run-rad6650fe27fd4cb1bf1c880e3fded42a.service: Deactivated successfully.
Oct 06 13:42:56 compute-0 sudo[138203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbrdhlbaebwotqtcipgldifdkbxzcgkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758176.481209-710-269048275794370/AnsiballZ_systemd.py'
Oct 06 13:42:56 compute-0 sudo[138203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:57 compute-0 python3.9[138205]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:42:57 compute-0 systemd[1]: Reloading.
Oct 06 13:42:57 compute-0 systemd-rc-local-generator[138231]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:42:57 compute-0 systemd-sysv-generator[138235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:42:57 compute-0 sudo[138203]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:58 compute-0 sudo[138392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfnnolfwtlytdbsjpmhckhqfmdfrxiyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758177.7684038-710-49750363800056/AnsiballZ_systemd.py'
Oct 06 13:42:58 compute-0 sudo[138392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:58 compute-0 python3.9[138394]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:42:58 compute-0 sudo[138392]: pam_unix(sudo:session): session closed for user root
Oct 06 13:42:59 compute-0 sudo[138547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvfatomiglkrsylsnguplswjvhqbpmfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758178.757041-710-140438204718954/AnsiballZ_systemd.py'
Oct 06 13:42:59 compute-0 sudo[138547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:42:59 compute-0 python3.9[138549]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:00 compute-0 systemd[1]: Reloading.
Oct 06 13:43:00 compute-0 systemd-sysv-generator[138578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:43:00 compute-0 systemd-rc-local-generator[138575]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:43:00 compute-0 sudo[138547]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:01 compute-0 sudo[138737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqdzlvptonkdmethdcwpkbiyvhxpcfbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758181.1006572-782-24628936510114/AnsiballZ_systemd.py'
Oct 06 13:43:01 compute-0 sudo[138737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:01 compute-0 python3.9[138739]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 06 13:43:01 compute-0 systemd[1]: Reloading.
Oct 06 13:43:01 compute-0 systemd-sysv-generator[138775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:43:01 compute-0 systemd-rc-local-generator[138771]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:43:02 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 06 13:43:02 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 06 13:43:02 compute-0 sudo[138737]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:02 compute-0 sudo[138930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvuyturrhotidmmebkyhcpmuhahdaju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758182.4578516-798-25098839259564/AnsiballZ_systemd.py'
Oct 06 13:43:02 compute-0 sudo[138930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:03 compute-0 python3.9[138932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:03 compute-0 sudo[138930]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:03 compute-0 sudo[139085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auntnqmrngizkktdllfgslurgwvsucmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758183.3764584-798-161654113099338/AnsiballZ_systemd.py'
Oct 06 13:43:03 compute-0 sudo[139085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:04 compute-0 python3.9[139087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:05 compute-0 sudo[139085]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:05 compute-0 sudo[139240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yviemezaszleyjmzwgmgwwaiekwwbqow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758185.3713555-798-119953061913989/AnsiballZ_systemd.py'
Oct 06 13:43:05 compute-0 sudo[139240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:06 compute-0 python3.9[139242]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:06 compute-0 sudo[139240]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:06 compute-0 sudo[139395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvjpeijsgzhehdxvktrskpvevageljce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758186.3128097-798-230939742883321/AnsiballZ_systemd.py'
Oct 06 13:43:06 compute-0 sudo[139395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:06 compute-0 python3.9[139397]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:07 compute-0 sudo[139395]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:07 compute-0 sudo[139550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxgfsxuzzdvshdhcmwbxdpbkviiwuvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758187.2052107-798-126495480120999/AnsiballZ_systemd.py'
Oct 06 13:43:07 compute-0 sudo[139550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:07 compute-0 python3.9[139552]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:08 compute-0 sudo[139550]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:09 compute-0 sudo[139705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstlvofengdtbhhtvqfkjqchxevsmqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758189.1606548-798-40157856279296/AnsiballZ_systemd.py'
Oct 06 13:43:09 compute-0 sudo[139705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:09 compute-0 python3.9[139707]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:09 compute-0 sudo[139705]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:10 compute-0 sudo[139860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deastrbnidnelbzazrpwkfooxondogdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758190.0267076-798-5492206953885/AnsiballZ_systemd.py'
Oct 06 13:43:10 compute-0 sudo[139860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:10 compute-0 python3.9[139862]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:10 compute-0 sudo[139860]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:11 compute-0 sudo[140015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzmvzbcmvgwqffflnvqrhsyarntdhhvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758190.9538329-798-178885924445964/AnsiballZ_systemd.py'
Oct 06 13:43:11 compute-0 sudo[140015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:43:11.311 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:43:11.312 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:43:11.312 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:43:11 compute-0 python3.9[140017]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:11 compute-0 sudo[140015]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:12 compute-0 sudo[140171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxwhgzaqwjvlvooqqtiyrohldbfilbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758191.8470836-798-215046932716341/AnsiballZ_systemd.py'
Oct 06 13:43:12 compute-0 sudo[140171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:12 compute-0 python3.9[140173]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:12 compute-0 sudo[140171]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:12 compute-0 podman[140176]: 2025-10-06 13:43:12.611467663 +0000 UTC m=+0.076311637 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 06 13:43:12 compute-0 podman[140175]: 2025-10-06 13:43:12.652824423 +0000 UTC m=+0.117890603 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 06 13:43:13 compute-0 sudo[140369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsomuswdirfvjksjzekpvqufcjrxkxik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758192.7329433-798-191489866168049/AnsiballZ_systemd.py'
Oct 06 13:43:13 compute-0 sudo[140369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:13 compute-0 python3.9[140371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:14 compute-0 sudo[140369]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:15 compute-0 sudo[140524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nutkbvivbsvlwthrknuybgrbmibyhbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758194.6944134-798-105967853555513/AnsiballZ_systemd.py'
Oct 06 13:43:15 compute-0 sudo[140524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:15 compute-0 python3.9[140526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:15 compute-0 sudo[140524]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:16 compute-0 sudo[140679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlrdiozemdxgynnxoacojgnagbpucjgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758195.6403983-798-218668577317869/AnsiballZ_systemd.py'
Oct 06 13:43:16 compute-0 sudo[140679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:16 compute-0 python3.9[140681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:16 compute-0 sudo[140679]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:16 compute-0 sudo[140834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzmbpiyjyvtjmdgkyhmthlofbzeatdah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758196.6316972-798-273911125419034/AnsiballZ_systemd.py'
Oct 06 13:43:16 compute-0 sudo[140834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:17 compute-0 python3.9[140836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:18 compute-0 sudo[140834]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:18 compute-0 sudo[140989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gypkcnwribhafrewuppogtratkejwauw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758198.5666387-798-129766677564631/AnsiballZ_systemd.py'
Oct 06 13:43:18 compute-0 sudo[140989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:19 compute-0 python3.9[140991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 06 13:43:19 compute-0 sudo[140989]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:20 compute-0 sudo[141144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpfjqucfmuojgwqaoqgsbkazdspglpfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758199.7254643-1002-268061352161291/AnsiballZ_file.py'
Oct 06 13:43:20 compute-0 sudo[141144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:20 compute-0 python3.9[141146]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:20 compute-0 sudo[141144]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:20 compute-0 sudo[141296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uptkqurcxrkhyjthfdffpvmqbimpvcba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758200.4097948-1002-183396430805279/AnsiballZ_file.py'
Oct 06 13:43:20 compute-0 sudo[141296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:20 compute-0 python3.9[141298]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:20 compute-0 sudo[141296]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:21 compute-0 sudo[141448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylxqkkumdbdfoofnfhvaiwhaclxriujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758201.091775-1002-129454419492603/AnsiballZ_file.py'
Oct 06 13:43:21 compute-0 sudo[141448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:21 compute-0 python3.9[141450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:21 compute-0 sudo[141448]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:22 compute-0 sudo[141600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzmhijqgidrfyemgkjhlsnaicrgmmdzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758201.7525034-1002-39868425480680/AnsiballZ_file.py'
Oct 06 13:43:22 compute-0 sudo[141600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:22 compute-0 python3.9[141602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:22 compute-0 sudo[141600]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:22 compute-0 sudo[141752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pincwvsitmtayenrroprfwixfxswokwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758202.530321-1002-5281940630186/AnsiballZ_file.py'
Oct 06 13:43:22 compute-0 sudo[141752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:23 compute-0 python3.9[141754]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:23 compute-0 sudo[141752]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:23 compute-0 sudo[141904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqyqxxtpckmswonnnpgfumsivycvezhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758203.1764278-1002-207552139989941/AnsiballZ_file.py'
Oct 06 13:43:23 compute-0 sudo[141904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:23 compute-0 python3.9[141906]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:43:23 compute-0 sudo[141904]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:24 compute-0 sudo[142056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztidooxrbxhhsdrfgltvzagehyydvwjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758203.891034-1088-238183442416931/AnsiballZ_stat.py'
Oct 06 13:43:24 compute-0 sudo[142056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:24 compute-0 python3.9[142058]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:24 compute-0 sudo[142056]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:25 compute-0 sudo[142181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osxfmqxpgcqhtwaohnqyvqnnzyabxffp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758203.891034-1088-238183442416931/AnsiballZ_copy.py'
Oct 06 13:43:25 compute-0 sudo[142181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:25 compute-0 python3.9[142183]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758203.891034-1088-238183442416931/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:25 compute-0 sudo[142181]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:25 compute-0 sudo[142333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjyaknqtymjitmdbuvwxgkferogqrkfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758205.5042293-1088-205733737882957/AnsiballZ_stat.py'
Oct 06 13:43:25 compute-0 sudo[142333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:26 compute-0 python3.9[142335]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:26 compute-0 sudo[142333]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:26 compute-0 sudo[142458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojvmbgvpcwgbprgoshqttubqucmqiajs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758205.5042293-1088-205733737882957/AnsiballZ_copy.py'
Oct 06 13:43:26 compute-0 sudo[142458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:26 compute-0 python3.9[142460]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758205.5042293-1088-205733737882957/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:26 compute-0 sudo[142458]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:27 compute-0 sudo[142610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gipdnbnqvhefixkkhcjszaytlrpvfemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758206.9157069-1088-235699596344131/AnsiballZ_stat.py'
Oct 06 13:43:27 compute-0 sudo[142610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:27 compute-0 python3.9[142612]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:27 compute-0 sudo[142610]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:27 compute-0 sudo[142735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkvjxsxphsidfojqsaezzgjvfqonciym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758206.9157069-1088-235699596344131/AnsiballZ_copy.py'
Oct 06 13:43:27 compute-0 sudo[142735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:28 compute-0 python3.9[142737]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758206.9157069-1088-235699596344131/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:28 compute-0 sudo[142735]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:28 compute-0 sudo[142887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxzzulfajtzecktihpswvllfynqqfzaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758208.427856-1088-262101274955157/AnsiballZ_stat.py'
Oct 06 13:43:28 compute-0 sudo[142887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:29 compute-0 python3.9[142889]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:29 compute-0 sudo[142887]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:29 compute-0 sudo[143012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkwynlnzloouaqzbjzfatlfyfbgnxxbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758208.427856-1088-262101274955157/AnsiballZ_copy.py'
Oct 06 13:43:29 compute-0 sudo[143012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:29 compute-0 python3.9[143014]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758208.427856-1088-262101274955157/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:29 compute-0 sudo[143012]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:30 compute-0 sudo[143164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djyaawrdmzanejbjzgpmulgzaikprkhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758210.111344-1088-128646679187343/AnsiballZ_stat.py'
Oct 06 13:43:30 compute-0 sudo[143164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:30 compute-0 python3.9[143166]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:30 compute-0 sudo[143164]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:31 compute-0 sudo[143289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fccadiqdeoudykbrggnoqnlosdruvlhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758210.111344-1088-128646679187343/AnsiballZ_copy.py'
Oct 06 13:43:31 compute-0 sudo[143289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:31 compute-0 python3.9[143291]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758210.111344-1088-128646679187343/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:31 compute-0 sudo[143289]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:31 compute-0 sudo[143441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxckwbbjedeaioydkwimmvqurduwxgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758211.498122-1088-270221634592038/AnsiballZ_stat.py'
Oct 06 13:43:31 compute-0 sudo[143441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:32 compute-0 python3.9[143443]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:32 compute-0 sudo[143441]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:32 compute-0 sudo[143566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwddqcsrgqjqpuhjgtoolajxoberuvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758211.498122-1088-270221634592038/AnsiballZ_copy.py'
Oct 06 13:43:32 compute-0 sudo[143566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:32 compute-0 python3.9[143568]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758211.498122-1088-270221634592038/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:32 compute-0 sudo[143566]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:33 compute-0 sudo[143718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phkzjbebbdaxmgenjkoctwvecohbmvmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758212.9325466-1088-280595296782832/AnsiballZ_stat.py'
Oct 06 13:43:33 compute-0 sudo[143718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:33 compute-0 python3.9[143720]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:33 compute-0 sudo[143718]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:33 compute-0 sudo[143841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxhwpizoohyxlbztzyljmttftuvjlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758212.9325466-1088-280595296782832/AnsiballZ_copy.py'
Oct 06 13:43:33 compute-0 sudo[143841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:34 compute-0 python3.9[143843]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758212.9325466-1088-280595296782832/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:34 compute-0 sudo[143841]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:34 compute-0 sudo[143993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srqfuqesmndoidxbqdhuxzdjqufaxevy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758214.3408728-1088-128496968112370/AnsiballZ_stat.py'
Oct 06 13:43:34 compute-0 sudo[143993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:34 compute-0 python3.9[143995]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:34 compute-0 sudo[143993]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:35 compute-0 sudo[144118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oofxmzehmxgpotxcsrobdqcqismzitqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758214.3408728-1088-128496968112370/AnsiballZ_copy.py'
Oct 06 13:43:35 compute-0 sudo[144118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:35 compute-0 python3.9[144120]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759758214.3408728-1088-128496968112370/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:35 compute-0 sudo[144118]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:36 compute-0 sudo[144270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dolbweojhfnvsqqkoulbgpcflschjewj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758215.9094594-1314-119431340593200/AnsiballZ_command.py'
Oct 06 13:43:36 compute-0 sudo[144270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:36 compute-0 python3.9[144272]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 06 13:43:36 compute-0 sudo[144270]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:37 compute-0 sudo[144423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxdfgjrouuptzllyydprpcvwpxsksvvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758216.7475946-1332-13378916602138/AnsiballZ_file.py'
Oct 06 13:43:37 compute-0 sudo[144423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:37 compute-0 python3.9[144425]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:37 compute-0 sudo[144423]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:37 compute-0 sudo[144575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxwdqpbversaisyhdolostbftnyajbxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758217.48881-1332-241854017085805/AnsiballZ_file.py'
Oct 06 13:43:37 compute-0 sudo[144575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:37 compute-0 python3.9[144577]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:38 compute-0 sudo[144575]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:38 compute-0 sudo[144727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gldpmqnmnjueojemfkzgyvgluyocufjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758218.18891-1332-151790395989008/AnsiballZ_file.py'
Oct 06 13:43:38 compute-0 sudo[144727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:38 compute-0 python3.9[144729]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:38 compute-0 sudo[144727]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:39 compute-0 sudo[144879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atgehalktcdkleqcoocoejfyieygwjns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758218.946541-1332-276351585360207/AnsiballZ_file.py'
Oct 06 13:43:39 compute-0 sudo[144879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:39 compute-0 python3.9[144881]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:39 compute-0 sudo[144879]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:39 compute-0 sudo[145031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqhqkfmwrzjnvbwtybpfqfgcpcorfcrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758219.604622-1332-119226108436673/AnsiballZ_file.py'
Oct 06 13:43:39 compute-0 sudo[145031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:40 compute-0 python3.9[145033]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:40 compute-0 sudo[145031]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:40 compute-0 sudo[145183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esqxanjszbdzxdzuyuzxucaikwnixfaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758220.5056043-1332-149805011864132/AnsiballZ_file.py'
Oct 06 13:43:40 compute-0 sudo[145183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:41 compute-0 python3.9[145185]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:41 compute-0 sudo[145183]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:41 compute-0 sudo[145335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eietkjgdtulutiksiitylinfjrwrxjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758221.3186762-1332-32709072052883/AnsiballZ_file.py'
Oct 06 13:43:41 compute-0 sudo[145335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:41 compute-0 python3.9[145337]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:41 compute-0 sudo[145335]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:42 compute-0 sudo[145487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhvruuturxntfclrpopxkdxqafxsmivr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758222.0250797-1332-279841687352185/AnsiballZ_file.py'
Oct 06 13:43:42 compute-0 sudo[145487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:42 compute-0 python3.9[145489]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:42 compute-0 sudo[145487]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:43 compute-0 sudo[145660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmbvsljvvmkjvjosccwuaxlywzvgxws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758222.7714665-1332-95633375208757/AnsiballZ_file.py'
Oct 06 13:43:43 compute-0 sudo[145660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:43 compute-0 podman[145614]: 2025-10-06 13:43:43.110639233 +0000 UTC m=+0.100978210 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 13:43:43 compute-0 podman[145613]: 2025-10-06 13:43:43.146125613 +0000 UTC m=+0.136646135 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:43:43 compute-0 python3.9[145671]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:43 compute-0 sudo[145660]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:43 compute-0 sudo[145832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfxmviqmwwqzdbnclfqstvwiyecagnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758223.4427736-1332-249338284952389/AnsiballZ_file.py'
Oct 06 13:43:43 compute-0 sudo[145832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:43 compute-0 python3.9[145834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:43 compute-0 sudo[145832]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:44 compute-0 sudo[145984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blxkjpvpmvcgtuvlaihhnxkxefvovlgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758224.147128-1332-99653489462847/AnsiballZ_file.py'
Oct 06 13:43:44 compute-0 sudo[145984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:44 compute-0 python3.9[145986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:44 compute-0 sudo[145984]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:45 compute-0 sudo[146136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcjexhdaqmtimjhaaddmkygrssfpqtxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758224.942992-1332-27856038065670/AnsiballZ_file.py'
Oct 06 13:43:45 compute-0 sudo[146136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:45 compute-0 python3.9[146138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:45 compute-0 sudo[146136]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:46 compute-0 sudo[146288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sglygkqkzupfgvmqbelkduefmwasrghg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758225.635802-1332-271701502444164/AnsiballZ_file.py'
Oct 06 13:43:46 compute-0 sudo[146288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:46 compute-0 python3.9[146290]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:46 compute-0 sudo[146288]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:46 compute-0 sudo[146440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmpghigyrtzfwbsyjrjercdasgmmyaus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758226.4013865-1332-46017206313759/AnsiballZ_file.py'
Oct 06 13:43:46 compute-0 sudo[146440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:46 compute-0 python3.9[146442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:46 compute-0 sudo[146440]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:47 compute-0 sudo[146592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwrqcblwnhfxxwtujugmjnkadtctjmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758227.1841493-1530-28074320695737/AnsiballZ_stat.py'
Oct 06 13:43:47 compute-0 sudo[146592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:47 compute-0 python3.9[146594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:47 compute-0 sudo[146592]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:48 compute-0 sudo[146715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfxmofdhrafzfezzvuwjcnapskeqngwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758227.1841493-1530-28074320695737/AnsiballZ_copy.py'
Oct 06 13:43:48 compute-0 sudo[146715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:48 compute-0 python3.9[146717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758227.1841493-1530-28074320695737/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:48 compute-0 sudo[146715]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:48 compute-0 sudo[146867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubllqgksvqjwsdcklqwwfvmssewmtpfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758228.5824473-1530-141098986750287/AnsiballZ_stat.py'
Oct 06 13:43:48 compute-0 sudo[146867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:49 compute-0 python3.9[146869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:49 compute-0 sudo[146867]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:49 compute-0 sudo[146990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzzmmfedfyqmypbttbqlbadltvkjmgro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758228.5824473-1530-141098986750287/AnsiballZ_copy.py'
Oct 06 13:43:49 compute-0 sudo[146990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:49 compute-0 python3.9[146992]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758228.5824473-1530-141098986750287/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:49 compute-0 sudo[146990]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:50 compute-0 sudo[147142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbnggowvvorcrjpcolwgteynixzwmxxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758230.006705-1530-164822002390053/AnsiballZ_stat.py'
Oct 06 13:43:50 compute-0 sudo[147142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:50 compute-0 python3.9[147144]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:50 compute-0 sudo[147142]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:51 compute-0 sudo[147265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcsauzvnwhvssvwpyfploipwkxoywybr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758230.006705-1530-164822002390053/AnsiballZ_copy.py'
Oct 06 13:43:51 compute-0 sudo[147265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:51 compute-0 python3.9[147267]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758230.006705-1530-164822002390053/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:51 compute-0 sudo[147265]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:51 compute-0 sudo[147417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwtmoxxrakyermvmpkxczoawucwukmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758231.385611-1530-281157687747723/AnsiballZ_stat.py'
Oct 06 13:43:51 compute-0 sudo[147417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:51 compute-0 python3.9[147419]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:51 compute-0 sudo[147417]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:52 compute-0 sudo[147540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfuqtesbakznxptnsvrvemfgebqjsog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758231.385611-1530-281157687747723/AnsiballZ_copy.py'
Oct 06 13:43:52 compute-0 sudo[147540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:52 compute-0 python3.9[147542]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758231.385611-1530-281157687747723/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:52 compute-0 sudo[147540]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:53 compute-0 sudo[147692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoygzpwsjrqtnjihkiraoqqnfvqhghol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758232.7629778-1530-245965851933546/AnsiballZ_stat.py'
Oct 06 13:43:53 compute-0 sudo[147692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:53 compute-0 python3.9[147694]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:53 compute-0 sudo[147692]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:53 compute-0 sudo[147815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbemylufdcqlfrnqbhapofulftwkfxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758232.7629778-1530-245965851933546/AnsiballZ_copy.py'
Oct 06 13:43:53 compute-0 sudo[147815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:53 compute-0 python3.9[147817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758232.7629778-1530-245965851933546/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:53 compute-0 sudo[147815]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:54 compute-0 sudo[147967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvdchscskttewxpsqkxgbzyfaboujkte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758234.0687227-1530-99817907747421/AnsiballZ_stat.py'
Oct 06 13:43:54 compute-0 sudo[147967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:54 compute-0 python3.9[147969]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:54 compute-0 sudo[147967]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:55 compute-0 sudo[148090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpsmxexoevtmvgeqdvbohuuqzepljede ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758234.0687227-1530-99817907747421/AnsiballZ_copy.py'
Oct 06 13:43:55 compute-0 sudo[148090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:55 compute-0 python3.9[148092]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758234.0687227-1530-99817907747421/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:55 compute-0 sudo[148090]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:55 compute-0 sudo[148242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owhwizedwfehnksterdhexzcnfjowlpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758235.4689603-1530-107585485342185/AnsiballZ_stat.py'
Oct 06 13:43:55 compute-0 sudo[148242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:56 compute-0 python3.9[148244]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:56 compute-0 sudo[148242]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:56 compute-0 sudo[148365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwzfnfuxwdkoocysswpnwchszjlrizmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758235.4689603-1530-107585485342185/AnsiballZ_copy.py'
Oct 06 13:43:56 compute-0 sudo[148365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:56 compute-0 python3.9[148367]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758235.4689603-1530-107585485342185/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:56 compute-0 sudo[148365]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:57 compute-0 sudo[148517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krnwrctpjhhdsrdtabtbvcjirxmeilmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758237.035713-1530-220290112943223/AnsiballZ_stat.py'
Oct 06 13:43:57 compute-0 sudo[148517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:57 compute-0 python3.9[148519]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:57 compute-0 sudo[148517]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:57 compute-0 sudo[148640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izrdxrwmjmsznsqeicloqsojxcmotnyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758237.035713-1530-220290112943223/AnsiballZ_copy.py'
Oct 06 13:43:57 compute-0 sudo[148640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:58 compute-0 python3.9[148642]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758237.035713-1530-220290112943223/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:58 compute-0 sudo[148640]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:58 compute-0 sudo[148792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amfcdhqagnjqomlgqzhktmzwbkxpmotg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758238.405561-1530-200324153263987/AnsiballZ_stat.py'
Oct 06 13:43:58 compute-0 sudo[148792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:58 compute-0 python3.9[148794]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:43:58 compute-0 sudo[148792]: pam_unix(sudo:session): session closed for user root
Oct 06 13:43:59 compute-0 sudo[148915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qphnvdgsbbtjmdhcbluzvysmbgtdupvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758238.405561-1530-200324153263987/AnsiballZ_copy.py'
Oct 06 13:43:59 compute-0 sudo[148915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:43:59 compute-0 python3.9[148917]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758238.405561-1530-200324153263987/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:43:59 compute-0 sudo[148915]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:00 compute-0 sudo[149067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meddbucdamdzeiwrjtsvduemgxegierb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758239.7718167-1530-195064397297265/AnsiballZ_stat.py'
Oct 06 13:44:00 compute-0 sudo[149067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:00 compute-0 python3.9[149069]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:00 compute-0 sudo[149067]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:00 compute-0 sudo[149190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqflozivhdxilcrnqdwpltfkqstozoyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758239.7718167-1530-195064397297265/AnsiballZ_copy.py'
Oct 06 13:44:00 compute-0 sudo[149190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:00 compute-0 python3.9[149192]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758239.7718167-1530-195064397297265/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:00 compute-0 sudo[149190]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:01 compute-0 sudo[149342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fanhwcnreuzyavkgengfbarwatgbaxvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758241.126463-1530-77451668396264/AnsiballZ_stat.py'
Oct 06 13:44:01 compute-0 sudo[149342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:01 compute-0 python3.9[149344]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:01 compute-0 sudo[149342]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:02 compute-0 sudo[149465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubzmlcrozguiuuyjwlbascxtsleynoft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758241.126463-1530-77451668396264/AnsiballZ_copy.py'
Oct 06 13:44:02 compute-0 sudo[149465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:02 compute-0 python3.9[149467]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758241.126463-1530-77451668396264/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:02 compute-0 sudo[149465]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:02 compute-0 sudo[149617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvrfcctasivsxvfbdoiifapozondijrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758242.4377575-1530-198825361755598/AnsiballZ_stat.py'
Oct 06 13:44:02 compute-0 sudo[149617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:02 compute-0 python3.9[149619]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:02 compute-0 sudo[149617]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:03 compute-0 sudo[149740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-navsksifuxpozonurvmgefrizqjxguko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758242.4377575-1530-198825361755598/AnsiballZ_copy.py'
Oct 06 13:44:03 compute-0 sudo[149740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:03 compute-0 python3.9[149742]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758242.4377575-1530-198825361755598/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:03 compute-0 sudo[149740]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:04 compute-0 sudo[149892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbrfzmrtvitsjczpodqoymlxgfxmsfyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758243.8481848-1530-82629680393130/AnsiballZ_stat.py'
Oct 06 13:44:04 compute-0 sudo[149892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:04 compute-0 python3.9[149894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:04 compute-0 sudo[149892]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:04 compute-0 sudo[150015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwbvbscvhsyxtvpkfcywhrcnnbnhiybp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758243.8481848-1530-82629680393130/AnsiballZ_copy.py'
Oct 06 13:44:04 compute-0 sudo[150015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:05 compute-0 python3.9[150017]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758243.8481848-1530-82629680393130/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:05 compute-0 sudo[150015]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:05 compute-0 sudo[150167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvauwolizskexwbyqubgfoxcohqvtjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758245.2096756-1530-183521515559750/AnsiballZ_stat.py'
Oct 06 13:44:05 compute-0 sudo[150167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:05 compute-0 python3.9[150169]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:05 compute-0 sudo[150167]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:06 compute-0 sudo[150290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvxxweecxcmgpkoqkbafpxgxfqhfhtjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758245.2096756-1530-183521515559750/AnsiballZ_copy.py'
Oct 06 13:44:06 compute-0 sudo[150290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:06 compute-0 python3.9[150292]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758245.2096756-1530-183521515559750/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:06 compute-0 sudo[150290]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:07 compute-0 python3.9[150442]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:44:07 compute-0 sudo[150595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stnievghpywivnrmrvhbdrgepmdjsaxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758247.4482937-1942-251748564107579/AnsiballZ_seboolean.py'
Oct 06 13:44:07 compute-0 sudo[150595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:08 compute-0 python3.9[150597]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 06 13:44:09 compute-0 sudo[150595]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:10 compute-0 sudo[150751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttramewozqwycymkzmyljsaggyeaueni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758249.7656844-1958-271816789592227/AnsiballZ_copy.py'
Oct 06 13:44:10 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 06 13:44:10 compute-0 sudo[150751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:10 compute-0 python3.9[150753]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:10 compute-0 sudo[150751]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:10 compute-0 sudo[150903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxxtfydqgslieaayfigxeaeosteifoog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758250.5546997-1958-102623459073742/AnsiballZ_copy.py'
Oct 06 13:44:10 compute-0 sudo[150903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:11 compute-0 python3.9[150905]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:11 compute-0 sudo[150903]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:44:11.313 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:44:11.314 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:44:11.314 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:44:11 compute-0 sudo[151056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swicbwjkkwaozxzsahtuctkxagfltwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758251.3587565-1958-233735920522806/AnsiballZ_copy.py'
Oct 06 13:44:11 compute-0 sudo[151056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:11 compute-0 python3.9[151058]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:11 compute-0 sudo[151056]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:12 compute-0 sudo[151208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thswlesrbujpzvfqnpsatypmmardbvzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758252.085054-1958-143020300106300/AnsiballZ_copy.py'
Oct 06 13:44:12 compute-0 sudo[151208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:12 compute-0 python3.9[151210]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:12 compute-0 sudo[151208]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:13 compute-0 sudo[151360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghofshjsbwsvppizvpbberusycjykdkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758252.780373-1958-111479543069842/AnsiballZ_copy.py'
Oct 06 13:44:13 compute-0 sudo[151360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:13 compute-0 python3.9[151362]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:13 compute-0 sudo[151360]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:13 compute-0 sudo[151543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbzjaeayxbxjoqisrhthkhkgxeelvhdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758253.5568-2030-55075691472242/AnsiballZ_copy.py'
Oct 06 13:44:13 compute-0 sudo[151543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:13 compute-0 podman[151487]: 2025-10-06 13:44:13.90681194 +0000 UTC m=+0.075418212 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 13:44:13 compute-0 podman[151486]: 2025-10-06 13:44:13.957904036 +0000 UTC m=+0.126493768 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:44:14 compute-0 python3.9[151552]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:14 compute-0 sudo[151543]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:14 compute-0 sudo[151708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erpotmdgrneswgdluigprigsnniurkgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758254.2956326-2030-158985380777037/AnsiballZ_copy.py'
Oct 06 13:44:14 compute-0 sudo[151708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:14 compute-0 python3.9[151710]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:14 compute-0 sudo[151708]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:15 compute-0 sudo[151860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwscasonwdembvtkwfrewkbpyqlkqsyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758255.0566418-2030-263269043845010/AnsiballZ_copy.py'
Oct 06 13:44:15 compute-0 sudo[151860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:15 compute-0 python3.9[151862]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:15 compute-0 sudo[151860]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:16 compute-0 sudo[152012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxbxuififyqxfaqoyqqcojexiobarvqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758255.8612313-2030-132367263617464/AnsiballZ_copy.py'
Oct 06 13:44:16 compute-0 sudo[152012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:16 compute-0 python3.9[152014]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:16 compute-0 sudo[152012]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:17 compute-0 sudo[152164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdlqahjooikrnaymqmvuasiwgemexciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758256.6792054-2030-45239680890366/AnsiballZ_copy.py'
Oct 06 13:44:17 compute-0 sudo[152164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:17 compute-0 python3.9[152166]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:17 compute-0 sudo[152164]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:18 compute-0 sudo[152316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mphpzxrijrmvtncvfkpjysgyrbnukoth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758257.6048703-2102-106583026257068/AnsiballZ_systemd.py'
Oct 06 13:44:18 compute-0 sudo[152316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:18 compute-0 python3.9[152318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:44:18 compute-0 systemd[1]: Reloading.
Oct 06 13:44:18 compute-0 systemd-sysv-generator[152343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:18 compute-0 systemd-rc-local-generator[152339]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:18 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct 06 13:44:18 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct 06 13:44:18 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 06 13:44:18 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 06 13:44:18 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct 06 13:44:18 compute-0 systemd[1]: Started libvirt logging daemon.
Oct 06 13:44:18 compute-0 sudo[152316]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:19 compute-0 sudo[152509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiilkcbunweaoflsmyabivypwilnassn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758259.122978-2102-19368342753045/AnsiballZ_systemd.py'
Oct 06 13:44:19 compute-0 sudo[152509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:19 compute-0 python3.9[152511]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:44:19 compute-0 systemd[1]: Reloading.
Oct 06 13:44:19 compute-0 systemd-sysv-generator[152539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:19 compute-0 systemd-rc-local-generator[152532]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:20 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 06 13:44:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 06 13:44:20 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 06 13:44:20 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 06 13:44:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 06 13:44:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 06 13:44:20 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 06 13:44:20 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 06 13:44:20 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 06 13:44:20 compute-0 sudo[152509]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:20 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 06 13:44:20 compute-0 sudo[152725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfwwabmxmsbzfryfrjgeebersedwauqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758260.242624-2102-264011465773581/AnsiballZ_systemd.py'
Oct 06 13:44:20 compute-0 sudo[152725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:20 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 06 13:44:20 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 06 13:44:20 compute-0 python3.9[152728]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:44:20 compute-0 systemd[1]: Reloading.
Oct 06 13:44:20 compute-0 systemd-sysv-generator[152764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:20 compute-0 systemd-rc-local-generator[152761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:21 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 06 13:44:21 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 06 13:44:21 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 06 13:44:21 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 06 13:44:21 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 13:44:21 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 13:44:21 compute-0 sudo[152725]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:21 compute-0 setroubleshoot[152572]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l c18e47f0-e8e2-41f0-b4a1-84045751e1fe
Oct 06 13:44:21 compute-0 setroubleshoot[152572]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Oct 06 13:44:21 compute-0 sudo[152943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exdtvtkumnjphmiqceuaezksazpdfbzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758261.3420308-2102-41085100571821/AnsiballZ_systemd.py'
Oct 06 13:44:21 compute-0 sudo[152943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:22 compute-0 python3.9[152945]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:44:22 compute-0 systemd[1]: Reloading.
Oct 06 13:44:22 compute-0 systemd-rc-local-generator[152974]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:22 compute-0 systemd-sysv-generator[152978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:22 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct 06 13:44:22 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 06 13:44:22 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 06 13:44:22 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 06 13:44:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 06 13:44:22 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 06 13:44:22 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 06 13:44:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 06 13:44:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 06 13:44:22 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 06 13:44:22 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 06 13:44:22 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 06 13:44:22 compute-0 sudo[152943]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:22 compute-0 sudo[153156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sanmadlavlyvumpgzeotdcbmbketilqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758262.6057942-2102-260147389637666/AnsiballZ_systemd.py'
Oct 06 13:44:22 compute-0 sudo[153156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:23 compute-0 python3.9[153158]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:44:23 compute-0 systemd[1]: Reloading.
Oct 06 13:44:23 compute-0 systemd-rc-local-generator[153188]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:23 compute-0 systemd-sysv-generator[153193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:23 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct 06 13:44:23 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct 06 13:44:23 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 06 13:44:23 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 06 13:44:23 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 06 13:44:23 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 06 13:44:23 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct 06 13:44:23 compute-0 systemd[1]: Started libvirt secret daemon.
Oct 06 13:44:23 compute-0 sudo[153156]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:24 compute-0 sudo[153367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umwdfqhrzspwexoqwkmkcgvdmicztjou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758263.9910502-2176-271038399707141/AnsiballZ_file.py'
Oct 06 13:44:24 compute-0 sudo[153367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:24 compute-0 python3.9[153369]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:24 compute-0 sudo[153367]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:25 compute-0 sudo[153519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yssfdqpgjijwippxcsjyejhoqzgqmhou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758264.6940358-2192-229182455772738/AnsiballZ_find.py'
Oct 06 13:44:25 compute-0 sudo[153519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:25 compute-0 python3.9[153521]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:44:25 compute-0 sudo[153519]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:26 compute-0 sudo[153671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzmwpnybbqufufkaexuyiswmgvgzcamg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758265.7774243-2220-187372594494743/AnsiballZ_stat.py'
Oct 06 13:44:26 compute-0 sudo[153671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:26 compute-0 python3.9[153673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:26 compute-0 sudo[153671]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:26 compute-0 sudo[153794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oftbwrfwtrhierufgsefyxmcfzdflgnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758265.7774243-2220-187372594494743/AnsiballZ_copy.py'
Oct 06 13:44:26 compute-0 sudo[153794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:27 compute-0 python3.9[153796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758265.7774243-2220-187372594494743/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:27 compute-0 sudo[153794]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:27 compute-0 sudo[153946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgpqcsgsjiupfysqxxzvjbwsfrvbcffm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758267.5207932-2252-263723540247592/AnsiballZ_file.py'
Oct 06 13:44:27 compute-0 sudo[153946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:28 compute-0 python3.9[153948]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:28 compute-0 sudo[153946]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:28 compute-0 sudo[154098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weafpwfcrfavbwbnmavgbqklpegfjjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758268.2872434-2268-90184575691964/AnsiballZ_stat.py'
Oct 06 13:44:28 compute-0 sudo[154098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:28 compute-0 python3.9[154100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:28 compute-0 sudo[154098]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:29 compute-0 sudo[154176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvlbflbjfsesyruwgwxzbrjfcsnbxazs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758268.2872434-2268-90184575691964/AnsiballZ_file.py'
Oct 06 13:44:29 compute-0 sudo[154176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:29 compute-0 python3.9[154178]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:29 compute-0 sudo[154176]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:30 compute-0 sudo[154328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trtajgdchndxveghddgqcsvmfokoqpyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758269.5649102-2292-18600559418817/AnsiballZ_stat.py'
Oct 06 13:44:30 compute-0 sudo[154328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:30 compute-0 python3.9[154330]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:30 compute-0 sudo[154328]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:30 compute-0 sudo[154406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flzfrtztwzcuzizfbijoghktipvajlei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758269.5649102-2292-18600559418817/AnsiballZ_file.py'
Oct 06 13:44:30 compute-0 sudo[154406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:30 compute-0 python3.9[154408]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xphobvs1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:30 compute-0 sudo[154406]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:31 compute-0 sudo[154558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhyilmyihebrybrvixbxowizkjleyowl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758270.9125-2316-212472372635753/AnsiballZ_stat.py'
Oct 06 13:44:31 compute-0 sudo[154558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:31 compute-0 python3.9[154560]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:31 compute-0 sudo[154558]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:31 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 06 13:44:31 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 06 13:44:31 compute-0 sudo[154637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdoeyoecrbndttvbkosamhgvxrodovqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758270.9125-2316-212472372635753/AnsiballZ_file.py'
Oct 06 13:44:31 compute-0 sudo[154637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:32 compute-0 python3.9[154639]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:32 compute-0 sudo[154637]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:32 compute-0 sudo[154789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhxbvokkxshvnvcnzbthjaxhhttrjhxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758272.2753916-2342-280871042574263/AnsiballZ_command.py'
Oct 06 13:44:32 compute-0 sudo[154789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:32 compute-0 python3.9[154791]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:44:32 compute-0 sudo[154789]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:33 compute-0 sudo[154942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkkiucqglgkzpitvwjhnagxafabrmjje ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758273.0700085-2358-255623182194664/AnsiballZ_edpm_nftables_from_files.py'
Oct 06 13:44:33 compute-0 sudo[154942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:33 compute-0 python3[154944]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 06 13:44:33 compute-0 sudo[154942]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:34 compute-0 sudo[155094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyudugbvatawozrpmxqnsfiofmtnfqlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758274.0855653-2374-146603272002998/AnsiballZ_stat.py'
Oct 06 13:44:34 compute-0 sudo[155094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:34 compute-0 python3.9[155096]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:34 compute-0 sudo[155094]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:35 compute-0 sudo[155172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsdzfzyqtmmotrqxormcqxymolprtjgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758274.0855653-2374-146603272002998/AnsiballZ_file.py'
Oct 06 13:44:35 compute-0 sudo[155172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:35 compute-0 python3.9[155174]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:35 compute-0 sudo[155172]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:35 compute-0 sudo[155324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwyoiifvzjghhgqgnegprjbwkuclwexf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758275.4631124-2398-110167126852471/AnsiballZ_stat.py'
Oct 06 13:44:35 compute-0 sudo[155324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:36 compute-0 python3.9[155326]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:36 compute-0 sudo[155324]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:36 compute-0 sudo[155402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlegkxjouyrmfucsafscylwwpdwkwcbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758275.4631124-2398-110167126852471/AnsiballZ_file.py'
Oct 06 13:44:36 compute-0 sudo[155402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:36 compute-0 python3.9[155404]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:36 compute-0 sudo[155402]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:37 compute-0 sudo[155554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unyhstrxkmijonfscanxlfsqzvunsgvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758276.8288283-2422-60786670865949/AnsiballZ_stat.py'
Oct 06 13:44:37 compute-0 sudo[155554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:37 compute-0 python3.9[155556]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:37 compute-0 sudo[155554]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:37 compute-0 sudo[155632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvgvcoyfbofrwkejbnfnbkhxswlpxnly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758276.8288283-2422-60786670865949/AnsiballZ_file.py'
Oct 06 13:44:37 compute-0 sudo[155632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:37 compute-0 python3.9[155634]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:37 compute-0 sudo[155632]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:38 compute-0 sudo[155784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmqydmrpgozbjixuminpopdhktoeyucz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758278.1273885-2446-90848175813726/AnsiballZ_stat.py'
Oct 06 13:44:38 compute-0 sudo[155784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:38 compute-0 python3.9[155786]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:38 compute-0 sudo[155784]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:39 compute-0 sudo[155862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atclghaqqnbcukgzafmtjtjvyvkmbszw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758278.1273885-2446-90848175813726/AnsiballZ_file.py'
Oct 06 13:44:39 compute-0 sudo[155862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:39 compute-0 python3.9[155864]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:39 compute-0 sudo[155862]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:40 compute-0 sudo[156014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rodmdexroldgyhgflrjztsbsrtppxbzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758279.61846-2470-136744324492013/AnsiballZ_stat.py'
Oct 06 13:44:40 compute-0 sudo[156014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:40 compute-0 python3.9[156016]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:41 compute-0 sudo[156014]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:41 compute-0 sudo[156139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opfiprlqqqsqyscgndfkqdjcoczuolwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758279.61846-2470-136744324492013/AnsiballZ_copy.py'
Oct 06 13:44:41 compute-0 sudo[156139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:41 compute-0 python3.9[156141]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758279.61846-2470-136744324492013/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:42 compute-0 sudo[156139]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:42 compute-0 sudo[156291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajnclzivlmqizltnxniwjylqyshmvahp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758282.21117-2500-250497114446585/AnsiballZ_file.py'
Oct 06 13:44:42 compute-0 sudo[156291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:42 compute-0 python3.9[156293]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:42 compute-0 sudo[156291]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:43 compute-0 sudo[156443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edwydhofylutwcosqaizxnjpcireilfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758282.9629245-2516-144791181990448/AnsiballZ_command.py'
Oct 06 13:44:43 compute-0 sudo[156443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:43 compute-0 python3.9[156445]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:44:43 compute-0 sudo[156443]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:44 compute-0 podman[156549]: 2025-10-06 13:44:44.232289415 +0000 UTC m=+0.086578046 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:44:44 compute-0 sudo[156636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxiqcygnfrdsulbsqaopvgzgvtiqdron ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758283.7716289-2532-172415766437901/AnsiballZ_blockinfile.py'
Oct 06 13:44:44 compute-0 sudo[156636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:44 compute-0 podman[156548]: 2025-10-06 13:44:44.355764859 +0000 UTC m=+0.211975293 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 13:44:44 compute-0 python3.9[156640]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:44 compute-0 sudo[156636]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:45 compute-0 sudo[156791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjofkuivrznywekkzaycratvqwjrnhor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758285.0243213-2550-23972900476459/AnsiballZ_command.py'
Oct 06 13:44:45 compute-0 sudo[156791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:45 compute-0 python3.9[156793]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:44:45 compute-0 sudo[156791]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:46 compute-0 sudo[156944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzrjxscpzjihqgycfqwvcwdbhdhrnlkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758285.8061843-2566-255129990885755/AnsiballZ_stat.py'
Oct 06 13:44:46 compute-0 sudo[156944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:46 compute-0 python3.9[156946]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:44:46 compute-0 sudo[156944]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:47 compute-0 sudo[157098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keyfcrvtvnfxikkuphyqvxuszlpkdgok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758286.6535907-2582-133038638282422/AnsiballZ_command.py'
Oct 06 13:44:47 compute-0 sudo[157098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:47 compute-0 python3.9[157100]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:44:47 compute-0 sudo[157098]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:47 compute-0 sudo[157253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bymkjqagivivwstdspvbfstluffybzsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758287.4557683-2598-139306121379046/AnsiballZ_file.py'
Oct 06 13:44:47 compute-0 sudo[157253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:47 compute-0 python3.9[157255]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:47 compute-0 sudo[157253]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:48 compute-0 sudo[157405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aplvhrxldhenqpvzwbaxqqledplmpfds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758288.1944845-2614-8196770871871/AnsiballZ_stat.py'
Oct 06 13:44:48 compute-0 sudo[157405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:48 compute-0 python3.9[157407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:48 compute-0 sudo[157405]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:49 compute-0 sudo[157528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftiafygbjgnokeoxhlgoiokxokhireas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758288.1944845-2614-8196770871871/AnsiballZ_copy.py'
Oct 06 13:44:49 compute-0 sudo[157528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:49 compute-0 python3.9[157530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758288.1944845-2614-8196770871871/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:49 compute-0 sudo[157528]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:50 compute-0 sudo[157680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zorusgplryxydftjipnjjrazbffmiavz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758289.8008564-2644-941685052364/AnsiballZ_stat.py'
Oct 06 13:44:50 compute-0 sudo[157680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:50 compute-0 python3.9[157682]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:50 compute-0 sudo[157680]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:50 compute-0 sudo[157803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riyzxsacrmxctpkrjqnrmojsnwbioxwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758289.8008564-2644-941685052364/AnsiballZ_copy.py'
Oct 06 13:44:50 compute-0 sudo[157803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:50 compute-0 python3.9[157805]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758289.8008564-2644-941685052364/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:50 compute-0 sudo[157803]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:51 compute-0 sudo[157955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubudhqiypqkohvexsfpdmwpsdrurjzic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758291.1778495-2674-124593252501052/AnsiballZ_stat.py'
Oct 06 13:44:51 compute-0 sudo[157955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:51 compute-0 python3.9[157957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:44:51 compute-0 sudo[157955]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:52 compute-0 sudo[158078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfxsaiqyhzrdwjrjsfitduyouxcfgyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758291.1778495-2674-124593252501052/AnsiballZ_copy.py'
Oct 06 13:44:52 compute-0 sudo[158078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:52 compute-0 python3.9[158080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758291.1778495-2674-124593252501052/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:44:52 compute-0 sudo[158078]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:52 compute-0 sudo[158230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yihqronywmkiodsxzfzufwgfwdbkzxyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758292.5595407-2704-220250148588830/AnsiballZ_systemd.py'
Oct 06 13:44:52 compute-0 sudo[158230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:53 compute-0 python3.9[158232]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:44:53 compute-0 systemd[1]: Reloading.
Oct 06 13:44:53 compute-0 systemd-rc-local-generator[158261]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:53 compute-0 systemd-sysv-generator[158264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:53 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct 06 13:44:53 compute-0 sudo[158230]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:54 compute-0 sudo[158422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxjhdmskjytacouqhnuyifipzctqdpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758293.7820122-2720-241088045363301/AnsiballZ_systemd.py'
Oct 06 13:44:54 compute-0 sudo[158422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:44:54 compute-0 python3.9[158424]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 06 13:44:54 compute-0 systemd[1]: Reloading.
Oct 06 13:44:54 compute-0 systemd-rc-local-generator[158446]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:54 compute-0 systemd-sysv-generator[158450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:54 compute-0 systemd[1]: Reloading.
Oct 06 13:44:54 compute-0 systemd-sysv-generator[158485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:44:54 compute-0 systemd-rc-local-generator[158481]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:44:55 compute-0 sudo[158422]: pam_unix(sudo:session): session closed for user root
Oct 06 13:44:55 compute-0 sshd-session[104215]: Connection closed by 192.168.122.30 port 34596
Oct 06 13:44:55 compute-0 sshd-session[104212]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:44:55 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Oct 06 13:44:55 compute-0 systemd[1]: session-24.scope: Consumed 3min 54.954s CPU time.
Oct 06 13:44:55 compute-0 systemd-logind[789]: Session 24 logged out. Waiting for processes to exit.
Oct 06 13:44:55 compute-0 systemd-logind[789]: Removed session 24.
Oct 06 13:45:00 compute-0 sshd-session[158519]: Accepted publickey for zuul from 192.168.122.30 port 60734 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:45:00 compute-0 systemd-logind[789]: New session 25 of user zuul.
Oct 06 13:45:00 compute-0 systemd[1]: Started Session 25 of User zuul.
Oct 06 13:45:00 compute-0 sshd-session[158519]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:45:01 compute-0 python3.9[158672]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:45:03 compute-0 sudo[158826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecfdgahzgqkroefwqtyyvcxijnqnmmal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758302.4331057-48-8273591319317/AnsiballZ_file.py'
Oct 06 13:45:03 compute-0 sudo[158826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:03 compute-0 python3.9[158828]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:03 compute-0 sudo[158826]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:03 compute-0 sudo[158978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxaqwutwqjfbihzxvpgqbqnprzcnkonb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758303.597117-48-194590673126254/AnsiballZ_file.py'
Oct 06 13:45:03 compute-0 sudo[158978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:04 compute-0 python3.9[158980]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:04 compute-0 sudo[158978]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:04 compute-0 sudo[159130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuslalalybdlcgciwzmsxvtidagnliii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758304.400491-48-129265149229943/AnsiballZ_file.py'
Oct 06 13:45:04 compute-0 sudo[159130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:04 compute-0 python3.9[159132]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:04 compute-0 sudo[159130]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:05 compute-0 sudo[159282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylgxdubccgpnhegtguiqlsmjqdenrkit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758305.1385233-48-263788415132559/AnsiballZ_file.py'
Oct 06 13:45:05 compute-0 sudo[159282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:05 compute-0 python3.9[159284]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 06 13:45:05 compute-0 sudo[159282]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:06 compute-0 sudo[159434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgrqxxhzgkmhfemyfihiuhojdnqprkuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758305.911003-48-255835585908080/AnsiballZ_file.py'
Oct 06 13:45:06 compute-0 sudo[159434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:06 compute-0 python3.9[159436]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:06 compute-0 sudo[159434]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:07 compute-0 sudo[159586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uipxrtuystpgqsmgmdsfnnhvbxnpbpcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758306.700689-120-150024460994833/AnsiballZ_stat.py'
Oct 06 13:45:07 compute-0 sudo[159586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:07 compute-0 python3.9[159588]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:07 compute-0 sudo[159586]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:08 compute-0 sudo[159740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xarrjneeaxciuzxhxqdodxouhalmekoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758307.566572-136-6012657353845/AnsiballZ_systemd.py'
Oct 06 13:45:08 compute-0 sudo[159740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:08 compute-0 python3.9[159742]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:45:08 compute-0 systemd[1]: Reloading.
Oct 06 13:45:08 compute-0 systemd-sysv-generator[159776]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:08 compute-0 systemd-rc-local-generator[159772]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:08 compute-0 sudo[159740]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:09 compute-0 sudo[159929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcuhqtonpctjipfnlcvpnoqoczgmmivx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758309.1983106-152-111307429537332/AnsiballZ_service_facts.py'
Oct 06 13:45:09 compute-0 sudo[159929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:09 compute-0 python3.9[159931]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:45:10 compute-0 network[159948]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:45:10 compute-0 network[159949]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:45:10 compute-0 network[159950]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:45:11.316 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:45:11.317 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:45:11.317 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:45:14 compute-0 sudo[159929]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:15 compute-0 podman[160149]: 2025-10-06 13:45:15.248475374 +0000 UTC m=+0.106293734 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 13:45:15 compute-0 podman[160150]: 2025-10-06 13:45:15.246330024 +0000 UTC m=+0.097671934 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 13:45:15 compute-0 sudo[160265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpqunkvdufnehgkwnhmwdveeqagmxosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758315.038417-168-20069972314193/AnsiballZ_systemd.py'
Oct 06 13:45:15 compute-0 sudo[160265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:15 compute-0 python3.9[160267]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:45:15 compute-0 systemd[1]: Reloading.
Oct 06 13:45:15 compute-0 systemd-rc-local-generator[160292]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:15 compute-0 systemd-sysv-generator[160298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:16 compute-0 sudo[160265]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:16 compute-0 python3.9[160454]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:17 compute-0 sudo[160604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvqbrwuybjkagqwnzenwpflsfhjnspf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758317.0374963-202-95990072984609/AnsiballZ_podman_container.py'
Oct 06 13:45:17 compute-0 sudo[160604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:17 compute-0 python3.9[160606]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None 
pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.06079974 +0000 UTC m=+0.060072076 container create a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 13:45:18 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.0899] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 06 13:45:18 compute-0 kernel: veth0: entered allmulticast mode
Oct 06 13:45:18 compute-0 kernel: veth0: entered promiscuous mode
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1294] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1314] device (veth0): carrier: link connected
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1316] device (podman0): carrier: link connected
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.037615263 +0000 UTC m=+0.036887629 image pull 0b62d011736892703306395462c684fe0dfe1473b0a9397423133e591c417adb 38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 06 13:45:18 compute-0 systemd-udevd[160670]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:45:18 compute-0 systemd-udevd[160667]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1676] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1687] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1699] device (podman0): Activation: starting connection 'podman0' (0fdbd1d6-3f43-49d1-9943-9a21c18cbc6a)
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1700] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1705] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1708] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.1711] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 06 13:45:18 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.2099] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.2102] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.2110] device (podman0): Activation: successful, device activated.
Oct 06 13:45:18 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 06 13:45:18 compute-0 systemd[1]: Started libpod-conmon-a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e.scope.
Oct 06 13:45:18 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.501956226 +0000 UTC m=+0.501255873 container init a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.514785884 +0000 UTC m=+0.514058250 container start a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.51858792 +0000 UTC m=+0.517860286 container attach a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:45:18 compute-0 iscsid_config[160800]: iqn.1994-05.com.redhat:398c7cb08cd2
Oct 06 13:45:18 compute-0 systemd[1]: libpod-a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e.scope: Deactivated successfully.
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.521169462 +0000 UTC m=+0.520441818 container died a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 06 13:45:18 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct 06 13:45:18 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct 06 13:45:18 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 06 13:45:18 compute-0 NetworkManager[52035]: <info>  [1759758318.5900] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 13:45:18 compute-0 systemd[1]: run-netns-netns\x2d68ea022a\x2d0930\x2d2727\x2d0cdb\x2dea3c0509750b.mount: Deactivated successfully.
Oct 06 13:45:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e-userdata-shm.mount: Deactivated successfully.
Oct 06 13:45:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-7633851b21045aa0702cd363c03cb9738d0ca2e3e284a49d933f3b101cae2c8d-merged.mount: Deactivated successfully.
Oct 06 13:45:18 compute-0 podman[160643]: 2025-10-06 13:45:18.923572239 +0000 UTC m=+0.922844585 container remove a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:45:18 compute-0 systemd[1]: libpod-conmon-a743cd1fbf57017c86d90fcbe7ee45e35d97957c71eea2ff1ba5530e7360da1e.scope: Deactivated successfully.
Oct 06 13:45:18 compute-0 python3.9[160606]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Oct 06 13:45:19 compute-0 python3.9[160606]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 06 13:45:19 compute-0 sudo[160604]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:19 compute-0 sudo[161038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcrcaidyxudgktklsjtykdojypkpkebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758319.3125758-218-98266788567327/AnsiballZ_stat.py'
Oct 06 13:45:19 compute-0 sudo[161038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:19 compute-0 python3.9[161040]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:19 compute-0 sudo[161038]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:20 compute-0 sudo[161161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bezbwoeyfpkixrypmuluenuyjgsglfcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758319.3125758-218-98266788567327/AnsiballZ_copy.py'
Oct 06 13:45:20 compute-0 sudo[161161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:20 compute-0 python3.9[161163]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758319.3125758-218-98266788567327/.source.iscsi _original_basename=.2w4_yp22 follow=False checksum=6e32039edfaeeed1b4a101b9315600b54d60e23b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:20 compute-0 sudo[161161]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:21 compute-0 sudo[161313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfyldfmheogssylldvzgflpkzulbpvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758320.7999494-248-69160162147959/AnsiballZ_file.py'
Oct 06 13:45:21 compute-0 sudo[161313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:21 compute-0 python3.9[161315]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:21 compute-0 sudo[161313]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:21 compute-0 python3.9[161465]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:22 compute-0 sudo[161617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooitqvuzsofxvsfwxkqdfqzsjvdgwvyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758322.2529798-282-70957156096730/AnsiballZ_lineinfile.py'
Oct 06 13:45:22 compute-0 sudo[161617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:22 compute-0 python3.9[161619]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:22 compute-0 sudo[161617]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:23 compute-0 sudo[161769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prdbyeucrqivxejeqprpgwiyaufcwjac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758323.1690884-300-265572511864112/AnsiballZ_file.py'
Oct 06 13:45:23 compute-0 sudo[161769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:23 compute-0 python3.9[161771]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:23 compute-0 sudo[161769]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:24 compute-0 sudo[161921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmvvzitnzvfpzepootcxfjukqfsgwypw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758323.8596718-316-250603696082702/AnsiballZ_stat.py'
Oct 06 13:45:24 compute-0 sudo[161921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:24 compute-0 python3.9[161923]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:24 compute-0 sudo[161921]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:24 compute-0 sudo[161999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqxrijvocgrqqngqpyowjgelqvbvrxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758323.8596718-316-250603696082702/AnsiballZ_file.py'
Oct 06 13:45:24 compute-0 sudo[161999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:24 compute-0 python3.9[162001]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:24 compute-0 sudo[161999]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:25 compute-0 sudo[162151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlbntsjnesymwcpgokcuisdbxdjzvpgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758325.0374374-316-93582878805810/AnsiballZ_stat.py'
Oct 06 13:45:25 compute-0 sudo[162151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:25 compute-0 python3.9[162153]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:25 compute-0 sudo[162151]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:25 compute-0 sudo[162229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzngcxkpuqreovytjwxpthfulwjpvfnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758325.0374374-316-93582878805810/AnsiballZ_file.py'
Oct 06 13:45:25 compute-0 sudo[162229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:25 compute-0 python3.9[162231]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:25 compute-0 sudo[162229]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:26 compute-0 sudo[162381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opspcytlyovlrwchwfovuugwpnapshel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758326.1993718-362-253408162257802/AnsiballZ_file.py'
Oct 06 13:45:26 compute-0 sudo[162381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:26 compute-0 python3.9[162383]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:26 compute-0 sudo[162381]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:27 compute-0 sudo[162533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qizfiayxhitbtvdwbkfnpspifqbfbcfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758326.954571-378-135801221349539/AnsiballZ_stat.py'
Oct 06 13:45:27 compute-0 sudo[162533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:27 compute-0 python3.9[162535]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:27 compute-0 sudo[162533]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:27 compute-0 sudo[162611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smxjatiqwpueoyjczbvxsfdpkslqcymt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758326.954571-378-135801221349539/AnsiballZ_file.py'
Oct 06 13:45:27 compute-0 sudo[162611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:27 compute-0 python3.9[162613]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:28 compute-0 sudo[162611]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:28 compute-0 sudo[162763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcglcejkketunefciqtwikqyooeuwbfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758328.2327495-402-136040196687815/AnsiballZ_stat.py'
Oct 06 13:45:28 compute-0 sudo[162763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:28 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 06 13:45:28 compute-0 python3.9[162765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:28 compute-0 sudo[162763]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:29 compute-0 sudo[162842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrlbofbltidwjovimsfhhazkuvhxzlfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758328.2327495-402-136040196687815/AnsiballZ_file.py'
Oct 06 13:45:29 compute-0 sudo[162842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:29 compute-0 python3.9[162844]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:29 compute-0 sudo[162842]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:29 compute-0 sudo[162994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egzmrgnmmghoehmicybffdcinekrbkuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758329.5049293-426-132713769299466/AnsiballZ_systemd.py'
Oct 06 13:45:29 compute-0 sudo[162994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:30 compute-0 python3.9[162996]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:45:30 compute-0 systemd[1]: Reloading.
Oct 06 13:45:30 compute-0 systemd-sysv-generator[163021]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:30 compute-0 systemd-rc-local-generator[163018]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:30 compute-0 sudo[162994]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:31 compute-0 sudo[163182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awctjfxzduentdvvoqcbsphyccpqtvjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758330.7407978-442-10754347064496/AnsiballZ_stat.py'
Oct 06 13:45:31 compute-0 sudo[163182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:31 compute-0 python3.9[163184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:31 compute-0 sudo[163182]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:31 compute-0 sudo[163260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxqcggimervnayljbamyqulqsyociran ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758330.7407978-442-10754347064496/AnsiballZ_file.py'
Oct 06 13:45:31 compute-0 sudo[163260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:31 compute-0 python3.9[163262]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:31 compute-0 sudo[163260]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:32 compute-0 sudo[163412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvxvkobdirmtvsqwkrzmnsypoprkcspq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758332.0570009-466-26598453304826/AnsiballZ_stat.py'
Oct 06 13:45:32 compute-0 sudo[163412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:32 compute-0 python3.9[163414]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:32 compute-0 sudo[163412]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:32 compute-0 sudo[163490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thjaegpxhvxrbpuujiescmvvmmihjtlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758332.0570009-466-26598453304826/AnsiballZ_file.py'
Oct 06 13:45:32 compute-0 sudo[163490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:33 compute-0 python3.9[163492]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:33 compute-0 sudo[163490]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:33 compute-0 sudo[163642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdhqsmrrbjnnxdernuiuklahudmloykm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758333.196455-490-238614937323197/AnsiballZ_systemd.py'
Oct 06 13:45:33 compute-0 sudo[163642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:33 compute-0 python3.9[163644]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:45:33 compute-0 systemd[1]: Reloading.
Oct 06 13:45:34 compute-0 systemd-rc-local-generator[163674]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:34 compute-0 systemd-sysv-generator[163677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:34 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:45:34 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:45:34 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:45:34 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:45:34 compute-0 sudo[163642]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:34 compute-0 sudo[163837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-burjcwgfgsafjdqulonnnmjkyrejzknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758334.6326082-510-105523049698661/AnsiballZ_file.py'
Oct 06 13:45:34 compute-0 sudo[163837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:35 compute-0 python3.9[163839]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:35 compute-0 sudo[163837]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:35 compute-0 sudo[163989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxdcttnblzpzkrsprsdaxkibkzqlgdeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758335.4440114-526-247114221422709/AnsiballZ_stat.py'
Oct 06 13:45:35 compute-0 sudo[163989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:35 compute-0 python3.9[163991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:35 compute-0 sudo[163989]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:36 compute-0 sudo[164112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmvysljfxmrzmyxcongixdymcxxikcnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758335.4440114-526-247114221422709/AnsiballZ_copy.py'
Oct 06 13:45:36 compute-0 sudo[164112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:36 compute-0 python3.9[164114]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758335.4440114-526-247114221422709/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:36 compute-0 sudo[164112]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:37 compute-0 sudo[164264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddfamntghyruawiohhhgabtyzxwvpieb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758337.0314631-560-5591663641857/AnsiballZ_file.py'
Oct 06 13:45:37 compute-0 sudo[164264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:37 compute-0 python3.9[164266]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:45:37 compute-0 sudo[164264]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:38 compute-0 sudo[164416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eykbxgwzupgvdzmnwtsptzbgqqgznbxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758337.7635202-576-190472951105606/AnsiballZ_stat.py'
Oct 06 13:45:38 compute-0 sudo[164416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:38 compute-0 python3.9[164418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:45:38 compute-0 sudo[164416]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:38 compute-0 sudo[164539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbtfnqfwvzqvxwrmhfcucovqcodgxiyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758337.7635202-576-190472951105606/AnsiballZ_copy.py'
Oct 06 13:45:38 compute-0 sudo[164539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:39 compute-0 python3.9[164541]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758337.7635202-576-190472951105606/.source.json _original_basename=.e09m2sp9 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:39 compute-0 sudo[164539]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:39 compute-0 sudo[164691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqorvfmeixjocjkvokptmbvpanmxrlfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758339.2654335-606-276702597416873/AnsiballZ_file.py'
Oct 06 13:45:39 compute-0 sudo[164691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:39 compute-0 python3.9[164693]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:39 compute-0 sudo[164691]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:40 compute-0 sudo[164843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbpusowbjmclnoyiiunufzsjpgpfczqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758340.1330369-622-203118253608796/AnsiballZ_stat.py'
Oct 06 13:45:40 compute-0 sudo[164843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:40 compute-0 sudo[164843]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:41 compute-0 sudo[164966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmhvzvcxicnpoogdblhefuridbqpdgwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758340.1330369-622-203118253608796/AnsiballZ_copy.py'
Oct 06 13:45:41 compute-0 sudo[164966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:41 compute-0 sudo[164966]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:42 compute-0 sudo[165118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swvzuztvkbhvzjxjfirzszbmvzcbltaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758341.7267177-656-196923457517222/AnsiballZ_container_config_data.py'
Oct 06 13:45:42 compute-0 sudo[165118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:42 compute-0 python3.9[165120]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 06 13:45:42 compute-0 sudo[165118]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:43 compute-0 sudo[165270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udqdwqnivplcsrspsiccpliktopqugss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758342.73129-674-162435970447551/AnsiballZ_container_config_hash.py'
Oct 06 13:45:43 compute-0 sudo[165270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:43 compute-0 python3.9[165272]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:45:43 compute-0 sudo[165270]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:44 compute-0 sudo[165422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciehsrhgjrhkwrjyvjuraoavcxurlrmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758343.81409-692-126291261206520/AnsiballZ_podman_container_info.py'
Oct 06 13:45:44 compute-0 sudo[165422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:44 compute-0 python3.9[165424]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 06 13:45:44 compute-0 sudo[165422]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:45 compute-0 sudo[165626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbzoscgxobjewfsrmgqkoqtwzvafvil ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758345.283083-718-246186483962089/AnsiballZ_edpm_container_manage.py'
Oct 06 13:45:45 compute-0 sudo[165626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:45 compute-0 podman[165575]: 2025-10-06 13:45:45.95502174 +0000 UTC m=+0.076548004 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 13:45:46 compute-0 podman[165574]: 2025-10-06 13:45:46.006187126 +0000 UTC m=+0.126658891 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 13:45:46 compute-0 python3[165636]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:45:46 compute-0 podman[165681]: 2025-10-06 13:45:46.48128465 +0000 UTC m=+0.083512539 container create b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct 06 13:45:46 compute-0 podman[165681]: 2025-10-06 13:45:46.442159369 +0000 UTC m=+0.044387318 image pull 0b62d011736892703306395462c684fe0dfe1473b0a9397423133e591c417adb 38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 06 13:45:46 compute-0 python3[165636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 06 13:45:46 compute-0 sudo[165626]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:47 compute-0 sudo[165869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugjpugodlnblaowrcrsqrjikdgfsselx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758346.8325863-734-195994168371496/AnsiballZ_stat.py'
Oct 06 13:45:47 compute-0 sudo[165869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:47 compute-0 python3.9[165871]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:47 compute-0 sudo[165869]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:47 compute-0 sudo[166023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niwogzhsqljmwahlrpmiinhurkdekuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758347.6575792-752-272178240173979/AnsiballZ_file.py'
Oct 06 13:45:47 compute-0 sudo[166023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:48 compute-0 python3.9[166025]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:48 compute-0 sudo[166023]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:48 compute-0 sudo[166099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xumwowhtgxbcopplredwkndknoimywza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758347.6575792-752-272178240173979/AnsiballZ_stat.py'
Oct 06 13:45:48 compute-0 sudo[166099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:48 compute-0 python3.9[166101]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:48 compute-0 sudo[166099]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:49 compute-0 sudo[166250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suqgvaffpofgjdylzoemqyohrybukjtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758348.811678-752-243264229410328/AnsiballZ_copy.py'
Oct 06 13:45:49 compute-0 sudo[166250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:49 compute-0 python3.9[166252]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758348.811678-752-243264229410328/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:49 compute-0 sudo[166250]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:49 compute-0 sudo[166326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvudvaawaxirwccrpxfbgqcgjqepxldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758348.811678-752-243264229410328/AnsiballZ_systemd.py'
Oct 06 13:45:49 compute-0 sudo[166326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:50 compute-0 python3.9[166328]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:45:50 compute-0 systemd[1]: Reloading.
Oct 06 13:45:50 compute-0 systemd-rc-local-generator[166356]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:50 compute-0 systemd-sysv-generator[166359]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:50 compute-0 sudo[166326]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:50 compute-0 sudo[166437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqvqrxeasjigbgpljqffcaprwuryjgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758348.811678-752-243264229410328/AnsiballZ_systemd.py'
Oct 06 13:45:50 compute-0 sudo[166437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:51 compute-0 python3.9[166439]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:45:51 compute-0 systemd[1]: Reloading.
Oct 06 13:45:51 compute-0 systemd-rc-local-generator[166466]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:45:51 compute-0 systemd-sysv-generator[166471]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:45:51 compute-0 systemd[1]: Starting iscsid container...
Oct 06 13:45:51 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:45:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce86d8c7ba30d1eebe58ced2a2f6d36f149f9a4b464cfddf24a17d1830d2f40e/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 06 13:45:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce86d8c7ba30d1eebe58ced2a2f6d36f149f9a4b464cfddf24a17d1830d2f40e/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:45:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce86d8c7ba30d1eebe58ced2a2f6d36f149f9a4b464cfddf24a17d1830d2f40e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:45:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480.
Oct 06 13:45:51 compute-0 podman[166479]: 2025-10-06 13:45:51.672203619 +0000 UTC m=+0.198985728 container init b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:45:51 compute-0 iscsid[166495]: + sudo -E kolla_set_configs
Oct 06 13:45:51 compute-0 podman[166479]: 2025-10-06 13:45:51.693455801 +0000 UTC m=+0.220237910 container start b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 13:45:51 compute-0 sudo[166501]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 06 13:45:51 compute-0 podman[166479]: iscsid
Oct 06 13:45:51 compute-0 systemd[1]: Started iscsid container.
Oct 06 13:45:51 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 06 13:45:51 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 06 13:45:51 compute-0 sudo[166437]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:51 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 06 13:45:51 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 06 13:45:51 compute-0 systemd[166516]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 06 13:45:51 compute-0 podman[166502]: 2025-10-06 13:45:51.786888036 +0000 UTC m=+0.071861305 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20250930, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4)
Oct 06 13:45:51 compute-0 systemd[1]: b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480-116ac1cce0f94bac.service: Main process exited, code=exited, status=1/FAILURE
Oct 06 13:45:51 compute-0 systemd[1]: b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480-116ac1cce0f94bac.service: Failed with result 'exit-code'.
Oct 06 13:45:51 compute-0 systemd[166516]: Queued start job for default target Main User Target.
Oct 06 13:45:51 compute-0 systemd[166516]: Created slice User Application Slice.
Oct 06 13:45:51 compute-0 systemd[166516]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 06 13:45:51 compute-0 systemd[166516]: Started Daily Cleanup of User's Temporary Directories.
Oct 06 13:45:51 compute-0 systemd[166516]: Reached target Paths.
Oct 06 13:45:51 compute-0 systemd[166516]: Reached target Timers.
Oct 06 13:45:51 compute-0 systemd[166516]: Starting D-Bus User Message Bus Socket...
Oct 06 13:45:51 compute-0 systemd[166516]: Starting Create User's Volatile Files and Directories...
Oct 06 13:45:51 compute-0 systemd[166516]: Listening on D-Bus User Message Bus Socket.
Oct 06 13:45:51 compute-0 systemd[166516]: Finished Create User's Volatile Files and Directories.
Oct 06 13:45:51 compute-0 systemd[166516]: Reached target Sockets.
Oct 06 13:45:51 compute-0 systemd[166516]: Reached target Basic System.
Oct 06 13:45:51 compute-0 systemd[166516]: Reached target Main User Target.
Oct 06 13:45:51 compute-0 systemd[166516]: Startup finished in 133ms.
Oct 06 13:45:51 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 06 13:45:51 compute-0 systemd[1]: Started Session c3 of User root.
Oct 06 13:45:51 compute-0 sudo[166501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:45:51 compute-0 iscsid[166495]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:45:51 compute-0 iscsid[166495]: INFO:__main__:Validating config file
Oct 06 13:45:51 compute-0 iscsid[166495]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:45:51 compute-0 iscsid[166495]: INFO:__main__:Writing out command to execute
Oct 06 13:45:51 compute-0 sudo[166501]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:51 compute-0 iscsid[166495]: ++ cat /run_command
Oct 06 13:45:51 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 06 13:45:51 compute-0 iscsid[166495]: + CMD='/usr/sbin/iscsid -f'
Oct 06 13:45:51 compute-0 iscsid[166495]: + ARGS=
Oct 06 13:45:51 compute-0 iscsid[166495]: + sudo kolla_copy_cacerts
Oct 06 13:45:52 compute-0 sudo[166622]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 06 13:45:52 compute-0 systemd[1]: Started Session c4 of User root.
Oct 06 13:45:52 compute-0 sudo[166622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:45:52 compute-0 sudo[166622]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:52 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 06 13:45:52 compute-0 iscsid[166495]: + [[ ! -n '' ]]
Oct 06 13:45:52 compute-0 iscsid[166495]: + . kolla_extend_start
Oct 06 13:45:52 compute-0 iscsid[166495]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 06 13:45:52 compute-0 iscsid[166495]: Running command: '/usr/sbin/iscsid -f'
Oct 06 13:45:52 compute-0 iscsid[166495]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 06 13:45:52 compute-0 iscsid[166495]: + umask 0022
Oct 06 13:45:52 compute-0 iscsid[166495]: + exec /usr/sbin/iscsid -f
Oct 06 13:45:52 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct 06 13:45:52 compute-0 python3.9[166700]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:45:52 compute-0 sudo[166850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfydxoohueoxqttmdsilgwvfcolfvgxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758352.621815-826-157260646329279/AnsiballZ_file.py'
Oct 06 13:45:52 compute-0 sudo[166850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:53 compute-0 python3.9[166852]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:45:53 compute-0 sudo[166850]: pam_unix(sudo:session): session closed for user root
Oct 06 13:45:53 compute-0 sudo[167002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfnouczudeejqwcvglrpnkvsfgjkoyoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758353.5865245-848-27374237256618/AnsiballZ_service_facts.py'
Oct 06 13:45:53 compute-0 sudo[167002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:45:54 compute-0 python3.9[167004]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:45:54 compute-0 network[167021]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:45:54 compute-0 network[167022]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:45:54 compute-0 network[167023]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:45:59 compute-0 sudo[167002]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:00 compute-0 sudo[167295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmsxmqafbohytyriqeblhsdqvmaskdfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758359.7925975-868-5956664801504/AnsiballZ_file.py'
Oct 06 13:46:00 compute-0 sudo[167295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:00 compute-0 python3.9[167297]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 06 13:46:00 compute-0 sudo[167295]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:01 compute-0 sudo[167447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iniyrynwhbebrxuflozcqzswujrhykbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758360.5338135-884-105756920546863/AnsiballZ_modprobe.py'
Oct 06 13:46:01 compute-0 sudo[167447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:01 compute-0 python3.9[167449]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 06 13:46:01 compute-0 sudo[167447]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:01 compute-0 sudo[167603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktooqcwclgdrpdjcawtrgpnhyqmzlhkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758361.426594-900-196533498805336/AnsiballZ_stat.py'
Oct 06 13:46:01 compute-0 sudo[167603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:01 compute-0 python3.9[167605]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:01 compute-0 sudo[167603]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:02 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 06 13:46:02 compute-0 systemd[166516]: Activating special unit Exit the Session...
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped target Main User Target.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped target Basic System.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped target Paths.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped target Sockets.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped target Timers.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 06 13:46:02 compute-0 systemd[166516]: Closed D-Bus User Message Bus Socket.
Oct 06 13:46:02 compute-0 systemd[166516]: Stopped Create User's Volatile Files and Directories.
Oct 06 13:46:02 compute-0 systemd[166516]: Removed slice User Application Slice.
Oct 06 13:46:02 compute-0 systemd[166516]: Reached target Shutdown.
Oct 06 13:46:02 compute-0 systemd[166516]: Finished Exit the Session.
Oct 06 13:46:02 compute-0 systemd[166516]: Reached target Exit the Session.
Oct 06 13:46:02 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 06 13:46:02 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 06 13:46:02 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 06 13:46:02 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 06 13:46:02 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 06 13:46:02 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 06 13:46:02 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 06 13:46:02 compute-0 sudo[167727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eakwtldmkvnyyljqtdifiylnjnsfyfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758361.426594-900-196533498805336/AnsiballZ_copy.py'
Oct 06 13:46:02 compute-0 sudo[167727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:02 compute-0 python3.9[167729]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758361.426594-900-196533498805336/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:02 compute-0 sudo[167727]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:03 compute-0 sudo[167879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhiwwttodnyxrhdzachwjkxzwlinimps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758362.7761135-932-75614477081557/AnsiballZ_lineinfile.py'
Oct 06 13:46:03 compute-0 sudo[167879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:03 compute-0 python3.9[167881]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:03 compute-0 sudo[167879]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:03 compute-0 sudo[168031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsmghmmtehzexnxmjpzkdlopzrjzhagv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758363.5174716-948-263518237637692/AnsiballZ_systemd.py'
Oct 06 13:46:03 compute-0 sudo[168031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:04 compute-0 python3.9[168033]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:46:04 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 06 13:46:04 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 06 13:46:04 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 06 13:46:04 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 06 13:46:04 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 06 13:46:04 compute-0 sudo[168031]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:04 compute-0 sudo[168187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwtpfzdeuksgazyjswysfhfelfodjkoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758364.5407465-964-241462084555199/AnsiballZ_file.py'
Oct 06 13:46:04 compute-0 sudo[168187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:05 compute-0 python3.9[168189]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:05 compute-0 sudo[168187]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:05 compute-0 sudo[168339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysjaagwsyjdlxgwndpfeeufxqsiegwel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758365.3685749-982-232115143697159/AnsiballZ_stat.py'
Oct 06 13:46:05 compute-0 sudo[168339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:05 compute-0 python3.9[168341]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:05 compute-0 sudo[168339]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:06 compute-0 sudo[168491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwxsseronxwvrdepqvzvyzxalnfmwxub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758366.193567-1000-224303279396747/AnsiballZ_stat.py'
Oct 06 13:46:06 compute-0 sudo[168491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:06 compute-0 python3.9[168493]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:06 compute-0 sudo[168491]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:07 compute-0 sudo[168643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmruicntnjylrthkitrgfrdkiijcrga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758366.9998152-1016-138529137285306/AnsiballZ_stat.py'
Oct 06 13:46:07 compute-0 sudo[168643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:07 compute-0 python3.9[168645]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:07 compute-0 sudo[168643]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:08 compute-0 sudo[168766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvfjdlhstamiqavukjbqywpjmxixjxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758366.9998152-1016-138529137285306/AnsiballZ_copy.py'
Oct 06 13:46:08 compute-0 sudo[168766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:08 compute-0 python3.9[168768]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758366.9998152-1016-138529137285306/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:08 compute-0 sudo[168766]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:08 compute-0 sudo[168918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdsdwuteiisfpjmzkyzawcaepgymosgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758368.4243908-1046-133322034713894/AnsiballZ_command.py'
Oct 06 13:46:08 compute-0 sudo[168918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:09 compute-0 python3.9[168920]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:46:09 compute-0 sudo[168918]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:09 compute-0 sudo[169071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejykyrsddlddhegeqtyxbrupflpdvktl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758369.3718948-1062-234452804575755/AnsiballZ_lineinfile.py'
Oct 06 13:46:09 compute-0 sudo[169071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:09 compute-0 python3.9[169073]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:09 compute-0 sudo[169071]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:10 compute-0 sudo[169223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdwyehkzmgbhsuapzmejjuxxwsrfhsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758370.1394794-1078-221604141788510/AnsiballZ_replace.py'
Oct 06 13:46:10 compute-0 sudo[169223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:10 compute-0 python3.9[169225]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:10 compute-0 sudo[169223]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:46:11.319 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:46:11.320 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:46:11.320 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:46:11 compute-0 sudo[169376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nchcsngxfjjdcjthfcxrardxsapfsqan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758371.0979397-1094-188590869743860/AnsiballZ_replace.py'
Oct 06 13:46:11 compute-0 sudo[169376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:11 compute-0 python3.9[169378]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:11 compute-0 sudo[169376]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:12 compute-0 sudo[169528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfuvqgrjstgcvxctpqgtmvmvpsdszskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758371.893666-1112-127532124414328/AnsiballZ_lineinfile.py'
Oct 06 13:46:12 compute-0 sudo[169528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:12 compute-0 python3.9[169530]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:12 compute-0 sudo[169528]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:12 compute-0 sudo[169680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzadvgyxajcoxowxkolvhnpwdnqfjmln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758372.6325912-1112-67921398940693/AnsiballZ_lineinfile.py'
Oct 06 13:46:12 compute-0 sudo[169680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:13 compute-0 python3.9[169682]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:13 compute-0 sudo[169680]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:13 compute-0 sudo[169832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvjswfqvghbdxxgksdkjpmiqvecrpaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758373.3500466-1112-26744598063461/AnsiballZ_lineinfile.py'
Oct 06 13:46:13 compute-0 sudo[169832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:13 compute-0 python3.9[169834]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:13 compute-0 sudo[169832]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:14 compute-0 sudo[169984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbqgkewkdyezdjqzqqmrqguhxmewogru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758374.0407917-1112-206803903058193/AnsiballZ_lineinfile.py'
Oct 06 13:46:14 compute-0 sudo[169984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:14 compute-0 python3.9[169986]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:14 compute-0 sudo[169984]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:15 compute-0 sudo[170136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knhazoymdmvadyjuvwzmvohazhgtfttu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758374.8139873-1170-19464673179207/AnsiballZ_stat.py'
Oct 06 13:46:15 compute-0 sudo[170136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:15 compute-0 python3.9[170138]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:15 compute-0 sudo[170136]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:15 compute-0 sudo[170290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfyqiynmlwfpxuazenmecfenkykrtzfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758375.5563078-1186-246792617512880/AnsiballZ_file.py'
Oct 06 13:46:15 compute-0 sudo[170290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:16 compute-0 python3.9[170292]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:16 compute-0 sudo[170290]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:16 compute-0 podman[170294]: 2025-10-06 13:46:16.245435796 +0000 UTC m=+0.089761355 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 13:46:16 compute-0 podman[170293]: 2025-10-06 13:46:16.293803548 +0000 UTC m=+0.138174798 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:46:16 compute-0 sudo[170484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiepfxyqqwcjkkxdhhwwaykrkpvotbub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758376.4001765-1204-5582874177242/AnsiballZ_file.py'
Oct 06 13:46:16 compute-0 sudo[170484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:16 compute-0 python3.9[170486]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:16 compute-0 sudo[170484]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:17 compute-0 sudo[170636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmeryccosgvywyjhfbodfbwqfwsocvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758377.137186-1220-197015007679907/AnsiballZ_stat.py'
Oct 06 13:46:17 compute-0 sudo[170636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:17 compute-0 python3.9[170638]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:17 compute-0 sudo[170636]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:18 compute-0 sudo[170714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ookrgyiwldftuafntyymvmxwroihodvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758377.137186-1220-197015007679907/AnsiballZ_file.py'
Oct 06 13:46:18 compute-0 sudo[170714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:18 compute-0 python3.9[170716]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:18 compute-0 sudo[170714]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:18 compute-0 sudo[170866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmrqjfzsrverdkpejzvloedpfxqeowv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758378.4872012-1220-279136616796068/AnsiballZ_stat.py'
Oct 06 13:46:18 compute-0 sudo[170866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:19 compute-0 python3.9[170868]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:19 compute-0 sudo[170866]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:19 compute-0 sudo[170944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfcrkrbydbgicyhbftcqhmnjrmwcisqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758378.4872012-1220-279136616796068/AnsiballZ_file.py'
Oct 06 13:46:19 compute-0 sudo[170944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:19 compute-0 python3.9[170946]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:19 compute-0 sudo[170944]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:20 compute-0 sudo[171096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygdumbdnmgxvkfxsrahdphvibjpqtwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758379.7492037-1266-200607977391252/AnsiballZ_file.py'
Oct 06 13:46:20 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 06 13:46:20 compute-0 sudo[171096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:20 compute-0 python3.9[171099]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:20 compute-0 sudo[171096]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:20 compute-0 sudo[171249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysrpqrbosxeekbwxkylzxizmdayujmnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758380.505287-1282-96971707124417/AnsiballZ_stat.py'
Oct 06 13:46:20 compute-0 sudo[171249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:21 compute-0 python3.9[171251]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:21 compute-0 sudo[171249]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:21 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 13:46:21 compute-0 sudo[171328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkeblvlkyeeludbvclrrjdvwbqdpebcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758380.505287-1282-96971707124417/AnsiballZ_file.py'
Oct 06 13:46:21 compute-0 sudo[171328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:21 compute-0 python3.9[171330]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:21 compute-0 sudo[171328]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:22 compute-0 sudo[171495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvgvaplqjdjdezixgvudghndtozlcpyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758381.7926626-1306-40429377139712/AnsiballZ_stat.py'
Oct 06 13:46:22 compute-0 podman[171454]: 2025-10-06 13:46:22.157315681 +0000 UTC m=+0.064690696 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct 06 13:46:22 compute-0 sudo[171495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:22 compute-0 python3.9[171502]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:22 compute-0 sudo[171495]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:22 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 06 13:46:22 compute-0 sudo[171579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtnnpmakunzngmplntfxqkhzxihrjssj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758381.7926626-1306-40429377139712/AnsiballZ_file.py'
Oct 06 13:46:22 compute-0 sudo[171579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:22 compute-0 python3.9[171581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:22 compute-0 sudo[171579]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:23 compute-0 sudo[171731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibhxaectrovfasxvhanvpbfmmkatzvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758383.038965-1330-123225432521437/AnsiballZ_systemd.py'
Oct 06 13:46:23 compute-0 sudo[171731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:23 compute-0 python3.9[171733]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:46:23 compute-0 systemd[1]: Reloading.
Oct 06 13:46:23 compute-0 systemd-rc-local-generator[171761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:46:23 compute-0 systemd-sysv-generator[171766]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:46:23 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 06 13:46:24 compute-0 sudo[171731]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:24 compute-0 sudo[171923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkzxrgqwsxtpdcfarinjvulsowvtxvdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758384.209968-1346-229213264671238/AnsiballZ_stat.py'
Oct 06 13:46:24 compute-0 sudo[171923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:24 compute-0 python3.9[171925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:24 compute-0 sudo[171923]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:25 compute-0 sudo[172001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trunvklanxouvydrmlfwsphzeipgpmxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758384.209968-1346-229213264671238/AnsiballZ_file.py'
Oct 06 13:46:25 compute-0 sudo[172001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:25 compute-0 python3.9[172003]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:25 compute-0 sudo[172001]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:25 compute-0 sudo[172153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaiapfcmzwoktoxfrohxwxviabnsrmgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758385.482176-1370-187344374047115/AnsiballZ_stat.py'
Oct 06 13:46:25 compute-0 sudo[172153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:25 compute-0 python3.9[172155]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:26 compute-0 sudo[172153]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:26 compute-0 sudo[172231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izehnwsbcffhhumeeykeletcssqbioxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758385.482176-1370-187344374047115/AnsiballZ_file.py'
Oct 06 13:46:26 compute-0 sudo[172231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:26 compute-0 python3.9[172233]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:26 compute-0 sudo[172231]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:26 compute-0 sudo[172383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vifophqokqvtqyjvtofhjmddrohhvniu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758386.6365001-1394-137812029047732/AnsiballZ_systemd.py'
Oct 06 13:46:26 compute-0 sudo[172383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:27 compute-0 python3.9[172385]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:46:27 compute-0 systemd[1]: Reloading.
Oct 06 13:46:27 compute-0 systemd-rc-local-generator[172411]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:46:27 compute-0 systemd-sysv-generator[172415]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:46:27 compute-0 systemd[1]: Starting Create netns directory...
Oct 06 13:46:27 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 06 13:46:27 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 06 13:46:27 compute-0 systemd[1]: Finished Create netns directory.
Oct 06 13:46:27 compute-0 sudo[172383]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:28 compute-0 sudo[172575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dztpjkgyhxkwubhczbsnzyzvaptoprqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758387.937981-1414-217477647813963/AnsiballZ_file.py'
Oct 06 13:46:28 compute-0 sudo[172575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:28 compute-0 python3.9[172577]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:28 compute-0 sudo[172575]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:29 compute-0 sudo[172727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vawhjxyrodqpzzzjzerhtlayxqmvjcsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758388.7336712-1430-103755188273251/AnsiballZ_stat.py'
Oct 06 13:46:29 compute-0 sudo[172727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:29 compute-0 python3.9[172729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:29 compute-0 sudo[172727]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:29 compute-0 sudo[172850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiyawgtncgosjzhyxzqrrimvqjnktcaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758388.7336712-1430-103755188273251/AnsiballZ_copy.py'
Oct 06 13:46:29 compute-0 sudo[172850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:29 compute-0 python3.9[172852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758388.7336712-1430-103755188273251/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:29 compute-0 sudo[172850]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:30 compute-0 sudo[173002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxlgalgyjtrilmdquybbgylroxyswzne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758390.3702636-1464-55936730229427/AnsiballZ_file.py'
Oct 06 13:46:30 compute-0 sudo[173002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:30 compute-0 python3.9[173004]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:46:30 compute-0 sudo[173002]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:31 compute-0 sudo[173154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kszdgguelpqmvplamkwqrihplkxukcve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758391.1887448-1480-242291345962726/AnsiballZ_stat.py'
Oct 06 13:46:31 compute-0 sudo[173154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:31 compute-0 python3.9[173156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:31 compute-0 sudo[173154]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:32 compute-0 sudo[173277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tctdvrncnfgixlpekmzkspntewenujni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758391.1887448-1480-242291345962726/AnsiballZ_copy.py'
Oct 06 13:46:32 compute-0 sudo[173277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:32 compute-0 python3.9[173279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758391.1887448-1480-242291345962726/.source.json _original_basename=.88nvyt9d follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:32 compute-0 sudo[173277]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:32 compute-0 sudo[173429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpjssfilvjtpbvcjxvxzxgnprdsdwjko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758392.6062055-1510-163261740498207/AnsiballZ_file.py'
Oct 06 13:46:32 compute-0 sudo[173429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:33 compute-0 python3.9[173431]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:33 compute-0 sudo[173429]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:33 compute-0 sudo[173581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjsryyarqlxorgjnmcibuazwynmqjjll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758393.3974152-1526-72605973182986/AnsiballZ_stat.py'
Oct 06 13:46:33 compute-0 sudo[173581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:33 compute-0 sudo[173581]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:34 compute-0 sudo[173704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvkgpjdhbzcsddkbgzebawhktnmcuta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758393.3974152-1526-72605973182986/AnsiballZ_copy.py'
Oct 06 13:46:34 compute-0 sudo[173704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:34 compute-0 sudo[173704]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:35 compute-0 sudo[173856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dheaquhuhbddiqyigzjpmrdhnvcgvnsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758394.8527975-1560-233758888074786/AnsiballZ_container_config_data.py'
Oct 06 13:46:35 compute-0 sudo[173856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:35 compute-0 python3.9[173858]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 06 13:46:35 compute-0 sudo[173856]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:35 compute-0 sudo[174008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aflekjyfguxqzkdvjnxfgglbnlvwnkhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758395.671954-1578-186411645263362/AnsiballZ_container_config_hash.py'
Oct 06 13:46:35 compute-0 sudo[174008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:36 compute-0 python3.9[174010]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:46:36 compute-0 sudo[174008]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:36 compute-0 sudo[174160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taugrttgbqcxdkkxfbvjjzpaemodpemo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758396.4788318-1596-52342799142117/AnsiballZ_podman_container_info.py'
Oct 06 13:46:36 compute-0 sudo[174160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:37 compute-0 python3.9[174162]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 06 13:46:37 compute-0 sudo[174160]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:38 compute-0 sudo[174339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxehdgtokoyzsqzvfydehpokrebmmqf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758397.9919772-1622-43793322120908/AnsiballZ_edpm_container_manage.py'
Oct 06 13:46:38 compute-0 sudo[174339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:38 compute-0 python3[174341]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:46:38 compute-0 podman[174378]: 2025-10-06 13:46:38.923741078 +0000 UTC m=+0.059667319 container create afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 06 13:46:38 compute-0 podman[174378]: 2025-10-06 13:46:38.896492659 +0000 UTC m=+0.032418910 image pull a64e163f15f11e74249854aa8fb3596c33858cb805250d3c9483585fd4a94bdb 38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 06 13:46:38 compute-0 python3[174341]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 06 13:46:39 compute-0 sudo[174339]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:39 compute-0 sudo[174566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lesdkzyztwopidbesckmyjdhoubgorxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758399.3210099-1638-224336000288409/AnsiballZ_stat.py'
Oct 06 13:46:39 compute-0 sudo[174566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:39 compute-0 python3.9[174568]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:39 compute-0 sudo[174566]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:40 compute-0 sudo[174720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsinykncwebdhskvljzkocomqwlduhqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758400.237059-1656-105861755512986/AnsiballZ_file.py'
Oct 06 13:46:40 compute-0 sudo[174720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:40 compute-0 python3.9[174722]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:40 compute-0 sudo[174720]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:41 compute-0 sudo[174796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-podvcitaidrbjpdzlyqdgqdlufxrjrfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758400.237059-1656-105861755512986/AnsiballZ_stat.py'
Oct 06 13:46:41 compute-0 sudo[174796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:41 compute-0 python3.9[174798]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:41 compute-0 sudo[174796]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:41 compute-0 sudo[174947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llyoyvvymcwqqcrecmcppbfxjfvthfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758401.3786716-1656-52848788136630/AnsiballZ_copy.py'
Oct 06 13:46:41 compute-0 sudo[174947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:42 compute-0 python3.9[174949]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758401.3786716-1656-52848788136630/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:42 compute-0 sudo[174947]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:42 compute-0 sudo[175023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljnytuwmclkvzbgaihzfvetuyuhrrgxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758401.3786716-1656-52848788136630/AnsiballZ_systemd.py'
Oct 06 13:46:42 compute-0 sudo[175023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:42 compute-0 python3.9[175025]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:46:42 compute-0 systemd[1]: Reloading.
Oct 06 13:46:42 compute-0 systemd-rc-local-generator[175050]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:46:42 compute-0 systemd-sysv-generator[175053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:46:43 compute-0 sudo[175023]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:43 compute-0 sudo[175134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sytzshyqcttghomiewzgfoqiukeaauco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758401.3786716-1656-52848788136630/AnsiballZ_systemd.py'
Oct 06 13:46:43 compute-0 sudo[175134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:43 compute-0 python3.9[175136]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:46:43 compute-0 systemd[1]: Reloading.
Oct 06 13:46:43 compute-0 systemd-rc-local-generator[175166]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:46:43 compute-0 systemd-sysv-generator[175171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:46:44 compute-0 systemd[1]: Starting multipathd container...
Oct 06 13:46:44 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:46:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64601b82bc41551f1153befb7003792128d584cb2bea56becfee754941cca087/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 06 13:46:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64601b82bc41551f1153befb7003792128d584cb2bea56becfee754941cca087/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:46:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.
Oct 06 13:46:44 compute-0 podman[175176]: 2025-10-06 13:46:44.283642863 +0000 UTC m=+0.174753240 container init afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:46:44 compute-0 multipathd[175191]: + sudo -E kolla_set_configs
Oct 06 13:46:44 compute-0 podman[175176]: 2025-10-06 13:46:44.321087088 +0000 UTC m=+0.212197465 container start afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 13:46:44 compute-0 sudo[175197]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 06 13:46:44 compute-0 sudo[175197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:46:44 compute-0 podman[175176]: multipathd
Oct 06 13:46:44 compute-0 systemd[1]: Started multipathd container.
Oct 06 13:46:44 compute-0 sudo[175134]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:44 compute-0 multipathd[175191]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:46:44 compute-0 multipathd[175191]: INFO:__main__:Validating config file
Oct 06 13:46:44 compute-0 multipathd[175191]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:46:44 compute-0 multipathd[175191]: INFO:__main__:Writing out command to execute
Oct 06 13:46:44 compute-0 sudo[175197]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:44 compute-0 multipathd[175191]: ++ cat /run_command
Oct 06 13:46:44 compute-0 multipathd[175191]: + CMD='/usr/sbin/multipathd -d'
Oct 06 13:46:44 compute-0 multipathd[175191]: + ARGS=
Oct 06 13:46:44 compute-0 multipathd[175191]: + sudo kolla_copy_cacerts
Oct 06 13:46:44 compute-0 podman[175198]: 2025-10-06 13:46:44.437547436 +0000 UTC m=+0.096037615 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:46:44 compute-0 sudo[175225]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 06 13:46:44 compute-0 sudo[175225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:46:44 compute-0 systemd[1]: afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-5d7fdeea1d298c45.service: Main process exited, code=exited, status=1/FAILURE
Oct 06 13:46:44 compute-0 systemd[1]: afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-5d7fdeea1d298c45.service: Failed with result 'exit-code'.
Oct 06 13:46:44 compute-0 sudo[175225]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:44 compute-0 multipathd[175191]: + [[ ! -n '' ]]
Oct 06 13:46:44 compute-0 multipathd[175191]: + . kolla_extend_start
Oct 06 13:46:44 compute-0 multipathd[175191]: Running command: '/usr/sbin/multipathd -d'
Oct 06 13:46:44 compute-0 multipathd[175191]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 06 13:46:44 compute-0 multipathd[175191]: + umask 0022
Oct 06 13:46:44 compute-0 multipathd[175191]: + exec /usr/sbin/multipathd -d
Oct 06 13:46:44 compute-0 multipathd[175191]: 2866.172542 | multipathd v0.9.9: start up
Oct 06 13:46:44 compute-0 multipathd[175191]: 2866.183455 | reconfigure: setting up paths and maps
Oct 06 13:46:44 compute-0 multipathd[175191]: 2866.185252 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Oct 06 13:46:44 compute-0 multipathd[175191]: 2866.186840 | updated bindings file /etc/multipath/bindings
Oct 06 13:46:45 compute-0 python3.9[175381]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:46:45 compute-0 sudo[175533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qetazvencsnlaskxmtjnuzbmrbepbfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758405.3384228-1728-224290588559799/AnsiballZ_command.py'
Oct 06 13:46:45 compute-0 sudo[175533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:45 compute-0 python3.9[175535]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:46:45 compute-0 sudo[175533]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:46 compute-0 sudo[175725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fooqgsqqejybesrwxemsrttdzbdnqfxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758406.3296106-1744-56550059892139/AnsiballZ_systemd.py'
Oct 06 13:46:46 compute-0 sudo[175725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:46 compute-0 podman[175669]: 2025-10-06 13:46:46.757212413 +0000 UTC m=+0.111774222 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:46:46 compute-0 podman[175670]: 2025-10-06 13:46:46.75672239 +0000 UTC m=+0.101005190 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:46:46 compute-0 python3.9[175737]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:46:47 compute-0 systemd[1]: Stopping multipathd container...
Oct 06 13:46:47 compute-0 multipathd[175191]: 2868.829182 | multipathd: shut down
Oct 06 13:46:47 compute-0 systemd[1]: libpod-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.scope: Deactivated successfully.
Oct 06 13:46:47 compute-0 podman[175746]: 2025-10-06 13:46:47.160226562 +0000 UTC m=+0.087527614 container died afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Oct 06 13:46:47 compute-0 systemd[1]: afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-5d7fdeea1d298c45.timer: Deactivated successfully.
Oct 06 13:46:47 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.
Oct 06 13:46:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-userdata-shm.mount: Deactivated successfully.
Oct 06 13:46:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-64601b82bc41551f1153befb7003792128d584cb2bea56becfee754941cca087-merged.mount: Deactivated successfully.
Oct 06 13:46:47 compute-0 podman[175746]: 2025-10-06 13:46:47.219818688 +0000 UTC m=+0.147119780 container cleanup afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 06 13:46:47 compute-0 podman[175746]: multipathd
Oct 06 13:46:47 compute-0 podman[175771]: multipathd
Oct 06 13:46:47 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 06 13:46:47 compute-0 systemd[1]: Stopped multipathd container.
Oct 06 13:46:47 compute-0 systemd[1]: Starting multipathd container...
Oct 06 13:46:47 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:46:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64601b82bc41551f1153befb7003792128d584cb2bea56becfee754941cca087/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 06 13:46:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64601b82bc41551f1153befb7003792128d584cb2bea56becfee754941cca087/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:46:47 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.
Oct 06 13:46:47 compute-0 podman[175784]: 2025-10-06 13:46:47.46213157 +0000 UTC m=+0.139242297 container init afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 13:46:47 compute-0 multipathd[175800]: + sudo -E kolla_set_configs
Oct 06 13:46:47 compute-0 podman[175784]: 2025-10-06 13:46:47.500641844 +0000 UTC m=+0.177752481 container start afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 13:46:47 compute-0 sudo[175806]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 06 13:46:47 compute-0 podman[175784]: multipathd
Oct 06 13:46:47 compute-0 sudo[175806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:46:47 compute-0 systemd[1]: Started multipathd container.
Oct 06 13:46:47 compute-0 sudo[175725]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:47 compute-0 multipathd[175800]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:46:47 compute-0 multipathd[175800]: INFO:__main__:Validating config file
Oct 06 13:46:47 compute-0 multipathd[175800]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:46:47 compute-0 multipathd[175800]: INFO:__main__:Writing out command to execute
Oct 06 13:46:47 compute-0 sudo[175806]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:47 compute-0 multipathd[175800]: ++ cat /run_command
Oct 06 13:46:47 compute-0 multipathd[175800]: + CMD='/usr/sbin/multipathd -d'
Oct 06 13:46:47 compute-0 multipathd[175800]: + ARGS=
Oct 06 13:46:47 compute-0 multipathd[175800]: + sudo kolla_copy_cacerts
Oct 06 13:46:47 compute-0 sudo[175834]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 06 13:46:47 compute-0 podman[175807]: 2025-10-06 13:46:47.600110632 +0000 UTC m=+0.081831090 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, tcib_build_tag=watcher_latest)
Oct 06 13:46:47 compute-0 sudo[175834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 06 13:46:47 compute-0 sudo[175834]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:47 compute-0 systemd[1]: afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-3a0d8729b47d6f17.service: Main process exited, code=exited, status=1/FAILURE
Oct 06 13:46:47 compute-0 systemd[1]: afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9-3a0d8729b47d6f17.service: Failed with result 'exit-code'.
Oct 06 13:46:47 compute-0 multipathd[175800]: + [[ ! -n '' ]]
Oct 06 13:46:47 compute-0 multipathd[175800]: + . kolla_extend_start
Oct 06 13:46:47 compute-0 multipathd[175800]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 06 13:46:47 compute-0 multipathd[175800]: Running command: '/usr/sbin/multipathd -d'
Oct 06 13:46:47 compute-0 multipathd[175800]: + umask 0022
Oct 06 13:46:47 compute-0 multipathd[175800]: + exec /usr/sbin/multipathd -d
Oct 06 13:46:47 compute-0 multipathd[175800]: 2869.330077 | multipathd v0.9.9: start up
Oct 06 13:46:47 compute-0 multipathd[175800]: 2869.338913 | reconfigure: setting up paths and maps
Oct 06 13:46:48 compute-0 sudo[175989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grfakjwmbthfavxrpeflahmneeeldbgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758407.757383-1760-119298568030948/AnsiballZ_file.py'
Oct 06 13:46:48 compute-0 sudo[175989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:48 compute-0 python3.9[175991]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:48 compute-0 sudo[175989]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:49 compute-0 sudo[176141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkjkkqioviodkqabayqmivjqstjvgqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758408.7260854-1784-80745428813314/AnsiballZ_file.py'
Oct 06 13:46:49 compute-0 sudo[176141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:49 compute-0 python3.9[176143]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 06 13:46:49 compute-0 sudo[176141]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:49 compute-0 sudo[176293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujkgdhtqvnfsycltkokslwgfdiobijfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758409.4765391-1800-115228202432760/AnsiballZ_modprobe.py'
Oct 06 13:46:49 compute-0 sudo[176293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:50 compute-0 python3.9[176295]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 06 13:46:50 compute-0 kernel: Key type psk registered
Oct 06 13:46:50 compute-0 sudo[176293]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:50 compute-0 sudo[176459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxwucyxorwnazlogxnhgigdqoyonnuxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758410.3523438-1816-71814822667744/AnsiballZ_stat.py'
Oct 06 13:46:50 compute-0 sudo[176459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:50 compute-0 python3.9[176461]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:46:50 compute-0 sudo[176459]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:51 compute-0 sudo[176582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtzmegofhgwuwyeulmdqbsupenkzmud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758410.3523438-1816-71814822667744/AnsiballZ_copy.py'
Oct 06 13:46:51 compute-0 sudo[176582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:51 compute-0 python3.9[176584]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758410.3523438-1816-71814822667744/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:51 compute-0 sudo[176582]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:52 compute-0 sudo[176734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwdswrwwwyorwbtkayglcvfxmpgalggx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758411.8232377-1848-131492060521795/AnsiballZ_lineinfile.py'
Oct 06 13:46:52 compute-0 sudo[176734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:52 compute-0 python3.9[176736]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:46:52 compute-0 sudo[176734]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:52 compute-0 sudo[176897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iguqwbhrnodfmttsnajzxpwpjufrsidd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758412.5693512-1864-174140758733172/AnsiballZ_systemd.py'
Oct 06 13:46:52 compute-0 sudo[176897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:52 compute-0 podman[176860]: 2025-10-06 13:46:52.991009966 +0000 UTC m=+0.109069518 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 13:46:53 compute-0 python3.9[176906]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:46:53 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 06 13:46:53 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 06 13:46:53 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 06 13:46:53 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 06 13:46:53 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 06 13:46:53 compute-0 sudo[176897]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:53 compute-0 sudo[177060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsjykrjparbnzizskfcmvsyzorzbmatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758413.5402765-1880-167921136492193/AnsiballZ_setup.py'
Oct 06 13:46:53 compute-0 sudo[177060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:54 compute-0 python3.9[177062]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 06 13:46:54 compute-0 sudo[177060]: pam_unix(sudo:session): session closed for user root
Oct 06 13:46:54 compute-0 sudo[177144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njxoqbvmvkpaitzuymkxemdrgrmjvamt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758413.5402765-1880-167921136492193/AnsiballZ_dnf.py'
Oct 06 13:46:54 compute-0 sudo[177144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:46:55 compute-0 python3.9[177146]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 06 13:47:01 compute-0 systemd[1]: Reloading.
Oct 06 13:47:01 compute-0 systemd-rc-local-generator[177177]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:47:01 compute-0 systemd-sysv-generator[177181]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:47:01 compute-0 systemd[1]: Reloading.
Oct 06 13:47:01 compute-0 systemd-rc-local-generator[177212]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:47:01 compute-0 systemd-sysv-generator[177215]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:47:01 compute-0 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 06 13:47:02 compute-0 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 06 13:47:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 06 13:47:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 06 13:47:02 compute-0 systemd[1]: Reloading.
Oct 06 13:47:02 compute-0 systemd-rc-local-generator[177307]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:47:02 compute-0 systemd-sysv-generator[177310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:47:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 06 13:47:03 compute-0 sudo[177144]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:03 compute-0 sudo[178593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqoqnchwxtpncbzlrddnqstbfxavryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758423.3622031-1904-153257959622573/AnsiballZ_file.py'
Oct 06 13:47:03 compute-0 sudo[178593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 06 13:47:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 06 13:47:03 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.662s CPU time.
Oct 06 13:47:03 compute-0 systemd[1]: run-re3c4134a438e443cb6975018e91225d2.service: Deactivated successfully.
Oct 06 13:47:03 compute-0 python3.9[178595]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:03 compute-0 sudo[178593]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:04 compute-0 python3.9[178746]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:47:05 compute-0 sudo[178900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shewzgijiukpccjyxfvsvpmmmdgikjys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758425.200753-1939-268001356452104/AnsiballZ_file.py'
Oct 06 13:47:05 compute-0 sudo[178900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:05 compute-0 python3.9[178902]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:05 compute-0 sudo[178900]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:06 compute-0 sudo[179052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsvbwruaidzwvrxkxvmrheodyralchqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758426.212038-1961-170545087516484/AnsiballZ_systemd_service.py'
Oct 06 13:47:06 compute-0 sudo[179052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:07 compute-0 python3.9[179054]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:47:07 compute-0 systemd[1]: Reloading.
Oct 06 13:47:07 compute-0 systemd-rc-local-generator[179085]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:47:07 compute-0 systemd-sysv-generator[179088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:47:07 compute-0 sudo[179052]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:08 compute-0 python3.9[179240]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:47:08 compute-0 network[179257]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:47:08 compute-0 network[179258]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:47:08 compute-0 network[179259]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:47:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:47:11.322 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:47:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:47:11.324 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:47:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:47:11.324 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:47:13 compute-0 sudo[179535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blsguqajkxcfdsqtmofsiancsgkjfcmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758433.1638582-1999-210593853428588/AnsiballZ_systemd_service.py'
Oct 06 13:47:13 compute-0 sudo[179535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:13 compute-0 python3.9[179537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:13 compute-0 sudo[179535]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:14 compute-0 sudo[179688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtduncrnraaxffgqjjqncjegjeugiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758434.0380588-1999-43619300026740/AnsiballZ_systemd_service.py'
Oct 06 13:47:14 compute-0 sudo[179688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:14 compute-0 python3.9[179690]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:14 compute-0 sudo[179688]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:15 compute-0 sudo[179841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aifymnprzweujxdhkdpjziadahuloplw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758434.93004-1999-169100817330197/AnsiballZ_systemd_service.py'
Oct 06 13:47:15 compute-0 sudo[179841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:15 compute-0 python3.9[179843]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:15 compute-0 sudo[179841]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:16 compute-0 sudo[179994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqecvjffajxqzuweevwilkfbwpvajhty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758435.782533-1999-246257600845492/AnsiballZ_systemd_service.py'
Oct 06 13:47:16 compute-0 sudo[179994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:16 compute-0 python3.9[179996]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:16 compute-0 sudo[179994]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:16 compute-0 sudo[180185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvkweugcfoxhrxzsfwbgkytevsfcsmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758436.5677166-1999-47309604136284/AnsiballZ_systemd_service.py'
Oct 06 13:47:16 compute-0 sudo[180185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:16 compute-0 podman[180122]: 2025-10-06 13:47:16.917920558 +0000 UTC m=+0.063171266 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 06 13:47:16 compute-0 podman[180121]: 2025-10-06 13:47:16.949259659 +0000 UTC m=+0.098932227 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 13:47:17 compute-0 python3.9[180189]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:17 compute-0 sudo[180185]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:17 compute-0 sudo[180345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqqqdznpawtujpwyedfxgcrcrqmsyhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758437.379562-1999-240051267100624/AnsiballZ_systemd_service.py'
Oct 06 13:47:17 compute-0 sudo[180345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:17 compute-0 podman[180347]: 2025-10-06 13:47:17.767516031 +0000 UTC m=+0.064259665 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:47:18 compute-0 python3.9[180348]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:18 compute-0 sudo[180345]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:18 compute-0 sudo[180518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wonsgdgnpnozynuwzxjgdpyrfkwzdtke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758438.2340443-1999-254380822223857/AnsiballZ_systemd_service.py'
Oct 06 13:47:18 compute-0 sudo[180518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:18 compute-0 python3.9[180520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:18 compute-0 sudo[180518]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:19 compute-0 sudo[180671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjtaginrioinitsrjsgfqlxrtpxfllow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758439.0995493-1999-88180472692437/AnsiballZ_systemd_service.py'
Oct 06 13:47:19 compute-0 sudo[180671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:19 compute-0 python3.9[180673]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:47:19 compute-0 sudo[180671]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:20 compute-0 sudo[180824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-expnkhjyltxsiweutdyyyekummxoyhkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758440.200644-2117-186425552305741/AnsiballZ_file.py'
Oct 06 13:47:20 compute-0 sudo[180824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:20 compute-0 python3.9[180826]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:20 compute-0 sudo[180824]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:21 compute-0 sudo[180976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iahlpoehtrebloryuvwhokmgmfbwnnkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758440.9704995-2117-39969993137821/AnsiballZ_file.py'
Oct 06 13:47:21 compute-0 sudo[180976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:21 compute-0 python3.9[180978]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:21 compute-0 sudo[180976]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:21 compute-0 sudo[181128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfxiktdwpzkdthcdosodvgzismfnjxvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758441.6870553-2117-65031034778360/AnsiballZ_file.py'
Oct 06 13:47:21 compute-0 sudo[181128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:22 compute-0 python3.9[181130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:22 compute-0 sudo[181128]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:22 compute-0 sudo[181280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckelydckpartxoeuzvzqrsjynizybec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758442.3213532-2117-3924183725588/AnsiballZ_file.py'
Oct 06 13:47:22 compute-0 sudo[181280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:22 compute-0 python3.9[181282]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:22 compute-0 sudo[181280]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:23 compute-0 podman[181385]: 2025-10-06 13:47:23.227442325 +0000 UTC m=+0.080511686 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 13:47:23 compute-0 sudo[181453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkmhvdegqbhjvbvtatswzlfrhmaucgll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758442.931929-2117-201793257741515/AnsiballZ_file.py'
Oct 06 13:47:23 compute-0 sudo[181453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:23 compute-0 python3.9[181455]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:23 compute-0 sudo[181453]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:23 compute-0 sudo[181605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucecdijachtfnrnowhypfzthhwsrouso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758443.6063194-2117-168077015546555/AnsiballZ_file.py'
Oct 06 13:47:23 compute-0 sudo[181605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:24 compute-0 python3.9[181607]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:24 compute-0 sudo[181605]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:24 compute-0 sudo[181757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoaxosxlrbllatrpkfsxjajsoqyuhmxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758444.3029892-2117-209703977202087/AnsiballZ_file.py'
Oct 06 13:47:24 compute-0 sudo[181757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:24 compute-0 python3.9[181759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:24 compute-0 sudo[181757]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:25 compute-0 sudo[181909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urieytiusukrpwovvbpenbacdwiemmxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758444.9684813-2117-73860728981469/AnsiballZ_file.py'
Oct 06 13:47:25 compute-0 sudo[181909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:25 compute-0 python3.9[181911]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:25 compute-0 sudo[181909]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:26 compute-0 sudo[182061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbfeftlnczgnrxgqcfxmlduzcpsutvrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758445.7706232-2231-67978047968979/AnsiballZ_file.py'
Oct 06 13:47:26 compute-0 sudo[182061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:26 compute-0 python3.9[182063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:26 compute-0 sudo[182061]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:26 compute-0 sudo[182213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frwzdefftxstooiroriwrqlvbsfukfbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758446.4473104-2231-40764230245085/AnsiballZ_file.py'
Oct 06 13:47:26 compute-0 sudo[182213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:27 compute-0 python3.9[182215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:27 compute-0 sudo[182213]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:27 compute-0 sudo[182365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhjhuhndvwfmxnmsgkevxdehbntjhcxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758447.2022257-2231-121298764242190/AnsiballZ_file.py'
Oct 06 13:47:27 compute-0 sudo[182365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:27 compute-0 python3.9[182367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:27 compute-0 sudo[182365]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:28 compute-0 sudo[182517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccrbvvtlthnrpggqiiqnnmyjzjfjglzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758447.8861306-2231-235881796476852/AnsiballZ_file.py'
Oct 06 13:47:28 compute-0 sudo[182517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:28 compute-0 python3.9[182519]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:28 compute-0 sudo[182517]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:28 compute-0 sudo[182669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zepnvgnwaorrivirvtzlqhowbkyjpbvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758448.5685637-2231-127584001215511/AnsiballZ_file.py'
Oct 06 13:47:28 compute-0 sudo[182669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:29 compute-0 python3.9[182671]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:29 compute-0 sudo[182669]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:29 compute-0 sudo[182821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwmprfdbkgavxsmhajvwrfzmcavbudpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758449.2747908-2231-44274562075648/AnsiballZ_file.py'
Oct 06 13:47:29 compute-0 sudo[182821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:29 compute-0 python3.9[182823]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:29 compute-0 sudo[182821]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:30 compute-0 sudo[182973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovuzbuljnwzohkcvdxidzuukemedzrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758449.998174-2231-3649279456256/AnsiballZ_file.py'
Oct 06 13:47:30 compute-0 sudo[182973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:30 compute-0 python3.9[182975]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:30 compute-0 sudo[182973]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:31 compute-0 sudo[183125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yolzcacvlczdyglcueeuqisuqtxgcxij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758450.704091-2231-71022521482205/AnsiballZ_file.py'
Oct 06 13:47:31 compute-0 sudo[183125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:31 compute-0 python3.9[183127]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:47:31 compute-0 sudo[183125]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:31 compute-0 sudo[183277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-numewqguxtwirgikvdmucokvqdmnrwkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758451.510814-2347-237714769695453/AnsiballZ_command.py'
Oct 06 13:47:31 compute-0 sudo[183277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:31 compute-0 python3.9[183279]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:32 compute-0 sudo[183277]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:33 compute-0 python3.9[183431]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:47:33 compute-0 sudo[183581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmyuqusrhgpegamxdjbturhegxwekpec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758453.4070911-2383-140399925650921/AnsiballZ_systemd_service.py'
Oct 06 13:47:33 compute-0 sudo[183581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:34 compute-0 python3.9[183583]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:47:34 compute-0 systemd[1]: Reloading.
Oct 06 13:47:34 compute-0 systemd-rc-local-generator[183608]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:47:34 compute-0 systemd-sysv-generator[183612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:47:34 compute-0 sudo[183581]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:35 compute-0 sudo[183768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzofharhacmpmpsepcdlpbafrdawpvxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758454.6541529-2399-256326894713367/AnsiballZ_command.py'
Oct 06 13:47:35 compute-0 sudo[183768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:35 compute-0 python3.9[183770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:35 compute-0 sudo[183768]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:35 compute-0 sudo[183921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zragdpdjxlkbhgwbuymswhvporgyjzmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758455.4583018-2399-168777477835754/AnsiballZ_command.py'
Oct 06 13:47:35 compute-0 sudo[183921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:36 compute-0 python3.9[183923]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:36 compute-0 sudo[183921]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:36 compute-0 sudo[184074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwalrhxqmlzkdekqmzrsognwhteatgux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758456.2065165-2399-272987992293884/AnsiballZ_command.py'
Oct 06 13:47:36 compute-0 sudo[184074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:36 compute-0 python3.9[184076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:36 compute-0 sudo[184074]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:37 compute-0 sudo[184227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kigkjohtvfejoyjokvkrorypegrmdpcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758456.8964138-2399-80313932095794/AnsiballZ_command.py'
Oct 06 13:47:37 compute-0 sudo[184227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:37 compute-0 python3.9[184229]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:37 compute-0 sudo[184227]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:37 compute-0 sudo[184380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qieseuoexzqiabjktcgbktkvjubazodg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758457.6175241-2399-249756395942635/AnsiballZ_command.py'
Oct 06 13:47:37 compute-0 sudo[184380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:38 compute-0 python3.9[184382]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:38 compute-0 sudo[184380]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:38 compute-0 sudo[184533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffcprekletpwdeskfxjuxeqtearvyiae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758458.3478732-2399-184987995675631/AnsiballZ_command.py'
Oct 06 13:47:38 compute-0 sudo[184533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:38 compute-0 python3.9[184535]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:38 compute-0 sudo[184533]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:39 compute-0 sudo[184686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijttarlyrlpibaoljiebrvgxoxidqawv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758459.121112-2399-70896586559548/AnsiballZ_command.py'
Oct 06 13:47:39 compute-0 sudo[184686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:39 compute-0 python3.9[184688]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:39 compute-0 sudo[184686]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:40 compute-0 sudo[184839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwghaymjjwqxmybzngrawpakbcptsnjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758459.8431168-2399-76772886225119/AnsiballZ_command.py'
Oct 06 13:47:40 compute-0 sudo[184839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:40 compute-0 python3.9[184841]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:47:40 compute-0 sudo[184839]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:41 compute-0 sudo[184992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgxnvqiyaulsxxwamcgcoizruukpkyll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758461.5340257-2542-27018446631450/AnsiballZ_file.py'
Oct 06 13:47:41 compute-0 sudo[184992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:42 compute-0 python3.9[184994]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:42 compute-0 sudo[184992]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:42 compute-0 sudo[185144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtdjflnfhimprafjixaodrqlzuapcalt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758462.291971-2542-119500792272350/AnsiballZ_file.py'
Oct 06 13:47:42 compute-0 sudo[185144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:42 compute-0 python3.9[185146]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:42 compute-0 sudo[185144]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:43 compute-0 sudo[185296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hypkwncpdqfzcyccmiiagiqrqvcqdquo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758463.100389-2542-26596149520180/AnsiballZ_file.py'
Oct 06 13:47:43 compute-0 sudo[185296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:43 compute-0 python3.9[185298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:43 compute-0 sudo[185296]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:44 compute-0 sudo[185448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsostqzegeauwjcqmxxibhrrkchpuxlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758463.89889-2586-222271899780227/AnsiballZ_file.py'
Oct 06 13:47:44 compute-0 sudo[185448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:44 compute-0 python3.9[185450]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:44 compute-0 sudo[185448]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:45 compute-0 sudo[185600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znnuiehudysyipnjpivgtnzktdwulpnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758464.6943712-2586-93640214488443/AnsiballZ_file.py'
Oct 06 13:47:45 compute-0 sudo[185600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:45 compute-0 python3.9[185602]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:45 compute-0 sudo[185600]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:45 compute-0 sudo[185752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rogdrfuqgfjmlibubszfjocblzazukdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758465.5048018-2586-246263678600137/AnsiballZ_file.py'
Oct 06 13:47:45 compute-0 sudo[185752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:46 compute-0 python3.9[185754]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:46 compute-0 sudo[185752]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:46 compute-0 sudo[185904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkymsrhknpakyixmuhvhpuiyxuxsiwzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758466.242785-2586-244275625377698/AnsiballZ_file.py'
Oct 06 13:47:46 compute-0 sudo[185904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:46 compute-0 python3.9[185906]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:46 compute-0 sudo[185904]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:47 compute-0 podman[186031]: 2025-10-06 13:47:47.223923253 +0000 UTC m=+0.073989220 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 13:47:47 compute-0 sudo[186092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwblcarilbhmkzfiunnkcwbxksoeyosy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758466.8958352-2586-181641264660626/AnsiballZ_file.py'
Oct 06 13:47:47 compute-0 sudo[186092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:47 compute-0 podman[186027]: 2025-10-06 13:47:47.309861426 +0000 UTC m=+0.160685513 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 13:47:47 compute-0 python3.9[186100]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:47 compute-0 sudo[186092]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:47 compute-0 sudo[186265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dchlqsqueaiayduedocgciatjstoixlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758467.6657827-2586-255182875378708/AnsiballZ_file.py'
Oct 06 13:47:47 compute-0 sudo[186265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:48 compute-0 podman[186228]: 2025-10-06 13:47:48.004441511 +0000 UTC m=+0.089796019 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 13:47:48 compute-0 python3.9[186270]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:48 compute-0 sudo[186265]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:48 compute-0 sudo[186428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvnnaznboyphxbfvehhafqmhvtrsgxxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758468.3515913-2586-152860993518936/AnsiballZ_file.py'
Oct 06 13:47:48 compute-0 sudo[186428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:48 compute-0 python3.9[186430]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:48 compute-0 sudo[186428]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:49 compute-0 sudo[186580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxraqpzumyvwuahhyypwpsdklweuyhem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758469.0714054-2586-36408560271996/AnsiballZ_file.py'
Oct 06 13:47:49 compute-0 sudo[186580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:49 compute-0 python3.9[186582]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:49 compute-0 sudo[186580]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:50 compute-0 sudo[186732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xviqqepgsxgenxxontiypaomajrqkogo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758469.7645426-2586-204274953718498/AnsiballZ_file.py'
Oct 06 13:47:50 compute-0 sudo[186732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:50 compute-0 python3.9[186734]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:47:50 compute-0 sudo[186732]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:54 compute-0 podman[186759]: 2025-10-06 13:47:54.223258994 +0000 UTC m=+0.085053260 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 13:47:55 compute-0 sudo[186904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xynvtyqwoaeogsatnjwwvcyaijhywrwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758474.6144407-2851-93808989018352/AnsiballZ_getent.py'
Oct 06 13:47:55 compute-0 sudo[186904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:55 compute-0 python3.9[186906]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 06 13:47:55 compute-0 sudo[186904]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:56 compute-0 sudo[187057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlpltxshegxqbfqqvedyweypnehnqlor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758475.479878-2867-193502064829235/AnsiballZ_group.py'
Oct 06 13:47:56 compute-0 sudo[187057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:56 compute-0 python3.9[187059]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:47:56 compute-0 groupadd[187060]: group added to /etc/group: name=nova, GID=42436
Oct 06 13:47:56 compute-0 groupadd[187060]: group added to /etc/gshadow: name=nova
Oct 06 13:47:56 compute-0 groupadd[187060]: new group: name=nova, GID=42436
Oct 06 13:47:56 compute-0 sudo[187057]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:57 compute-0 sudo[187215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-terszqpogovucuojkuepwqrmndpyomsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758476.7103019-2883-247208419891200/AnsiballZ_user.py'
Oct 06 13:47:57 compute-0 sudo[187215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:47:57 compute-0 python3.9[187217]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 06 13:47:57 compute-0 useradd[187219]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 06 13:47:57 compute-0 useradd[187219]: add 'nova' to group 'libvirt'
Oct 06 13:47:57 compute-0 useradd[187219]: add 'nova' to shadow group 'libvirt'
Oct 06 13:47:57 compute-0 sudo[187215]: pam_unix(sudo:session): session closed for user root
Oct 06 13:47:58 compute-0 sshd-session[187250]: Accepted publickey for zuul from 192.168.122.30 port 39622 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:47:58 compute-0 systemd-logind[789]: New session 27 of user zuul.
Oct 06 13:47:58 compute-0 systemd[1]: Started Session 27 of User zuul.
Oct 06 13:47:58 compute-0 sshd-session[187250]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:47:58 compute-0 sshd-session[187253]: Received disconnect from 192.168.122.30 port 39622:11: disconnected by user
Oct 06 13:47:58 compute-0 sshd-session[187253]: Disconnected from user zuul 192.168.122.30 port 39622
Oct 06 13:47:58 compute-0 sshd-session[187250]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:47:58 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Oct 06 13:47:58 compute-0 systemd-logind[789]: Session 27 logged out. Waiting for processes to exit.
Oct 06 13:47:58 compute-0 systemd-logind[789]: Removed session 27.
Oct 06 13:47:59 compute-0 python3.9[187403]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:47:59 compute-0 python3.9[187524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758478.855287-2933-20364772304912/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:00 compute-0 python3.9[187674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:01 compute-0 python3.9[187750]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:01 compute-0 python3.9[187900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:02 compute-0 python3.9[188021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758481.3634462-2933-134350647803908/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:03 compute-0 python3.9[188171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:03 compute-0 python3.9[188292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758482.7466779-2933-229682356212487/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:04 compute-0 python3.9[188442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:05 compute-0 python3.9[188563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758483.9705603-2933-1573554347503/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:05 compute-0 sudo[188713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uviwrevvgxhqtehmatakmvqjchtajrwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758485.3561473-3071-63474916764267/AnsiballZ_file.py'
Oct 06 13:48:05 compute-0 sudo[188713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:05 compute-0 python3.9[188715]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:05 compute-0 sudo[188713]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:06 compute-0 sudo[188865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzwcmlsukxalltszorjdkfrionevpcly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758486.1215413-3087-45987866271448/AnsiballZ_copy.py'
Oct 06 13:48:06 compute-0 sudo[188865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:06 compute-0 python3.9[188867]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:06 compute-0 sudo[188865]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:07 compute-0 sudo[189017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlpfajcqaozcmvdbywlfengkgnabrmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758486.816572-3103-264063220129659/AnsiballZ_stat.py'
Oct 06 13:48:07 compute-0 sudo[189017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:07 compute-0 python3.9[189019]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:07 compute-0 sudo[189017]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:07 compute-0 sudo[189169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovhwqvifhplpqurozvgixrtqdcbhkmxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758487.5353138-3119-255973373645317/AnsiballZ_stat.py'
Oct 06 13:48:07 compute-0 sudo[189169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:08 compute-0 python3.9[189171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:08 compute-0 sudo[189169]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:08 compute-0 sudo[189292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfrhaamdrvbalyisfsncvpiepdfatyqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758487.5353138-3119-255973373645317/AnsiballZ_copy.py'
Oct 06 13:48:08 compute-0 sudo[189292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:08 compute-0 python3.9[189294]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759758487.5353138-3119-255973373645317/.source _original_basename=.ip_f2vt2 follow=False checksum=c7c5e2f2d5e5fb87c87b4a2345a4160728b9c2f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 06 13:48:08 compute-0 sudo[189292]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:09 compute-0 python3.9[189446]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:10 compute-0 python3.9[189598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:10 compute-0 auditd[702]: Audit daemon rotating log files
Oct 06 13:48:11 compute-0 python3.9[189719]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758489.8359797-3171-25194259334989/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=ccc524d9f73e469e06a3336d0b85c40bd4e0e56a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:48:11.325 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:48:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:48:11.326 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:48:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:48:11.326 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:48:11 compute-0 python3.9[189870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:12 compute-0 python3.9[189991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758491.2856665-3201-176093154297809/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=45e514538ff5ecb7b84ee77d8501cf7833099300 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:13 compute-0 sudo[190141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzrxfrbycqirusoybbsbmtregqpnlean ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758492.849833-3235-1618333633222/AnsiballZ_container_config_data.py'
Oct 06 13:48:13 compute-0 sudo[190141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:13 compute-0 python3.9[190143]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 06 13:48:13 compute-0 sudo[190141]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:14 compute-0 sudo[190293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheigqlodwoxbrpuzxxquboemqqlvsee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758493.6636403-3253-250443322225977/AnsiballZ_container_config_hash.py'
Oct 06 13:48:14 compute-0 sudo[190293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:14 compute-0 python3.9[190295]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:48:14 compute-0 sudo[190293]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:14 compute-0 sudo[190445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwdlydnhtxguflacbupxpkgntdcdwkzz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758494.6293652-3273-127991095728152/AnsiballZ_edpm_container_manage.py'
Oct 06 13:48:14 compute-0 sudo[190445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:15 compute-0 python3[190447]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:48:15 compute-0 podman[190480]: 2025-10-06 13:48:15.493409375 +0000 UTC m=+0.031178787 image pull 920af19e5030aa8d226c8406b11c407c332317d692c620edd5e546aed379868d 38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 06 13:48:15 compute-0 podman[190480]: 2025-10-06 13:48:15.658056925 +0000 UTC m=+0.195826357 container create 46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 06 13:48:15 compute-0 python3[190447]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 06 13:48:15 compute-0 sudo[190445]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:16 compute-0 sudo[190668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyywaxetpanmsyaxkbfaoabexssrsdux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758496.077851-3289-5370942625928/AnsiballZ_stat.py'
Oct 06 13:48:16 compute-0 sudo[190668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:16 compute-0 python3.9[190670]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:16 compute-0 sudo[190668]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:17 compute-0 sudo[190843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdyqhiczvlpwecicbjyezccxryqrlrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758497.1104622-3313-270887562294588/AnsiballZ_container_config_data.py'
Oct 06 13:48:17 compute-0 sudo[190843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:17 compute-0 podman[190797]: 2025-10-06 13:48:17.477152724 +0000 UTC m=+0.085631025 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 13:48:17 compute-0 podman[190796]: 2025-10-06 13:48:17.514564629 +0000 UTC m=+0.123726318 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 13:48:17 compute-0 python3.9[190845]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 06 13:48:17 compute-0 sudo[190843]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:18 compute-0 sudo[191031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqqfyxkllwgnskzjrvzezwserltukclw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758497.91185-3331-218176731474841/AnsiballZ_container_config_hash.py'
Oct 06 13:48:18 compute-0 sudo[191031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:18 compute-0 podman[190979]: 2025-10-06 13:48:18.23881494 +0000 UTC m=+0.088866284 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 13:48:18 compute-0 python3.9[191039]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:48:18 compute-0 sudo[191031]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:19 compute-0 sudo[191192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxnbrzhaupsoiquddafoeqykcihfhcj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758498.7923899-3351-48279279515005/AnsiballZ_edpm_container_manage.py'
Oct 06 13:48:19 compute-0 sudo[191192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:19 compute-0 python3[191194]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:48:19 compute-0 podman[191231]: 2025-10-06 13:48:19.614081753 +0000 UTC m=+0.059912938 container create 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=nova_compute, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Oct 06 13:48:19 compute-0 podman[191231]: 2025-10-06 13:48:19.589085374 +0000 UTC m=+0.034916589 image pull 920af19e5030aa8d226c8406b11c407c332317d692c620edd5e546aed379868d 38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 06 13:48:19 compute-0 python3[191194]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Oct 06 13:48:19 compute-0 sudo[191192]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:20 compute-0 sudo[191419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odujqmghrbkxpmfoqudhjukcgeecrhor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758500.001393-3367-265715854205971/AnsiballZ_stat.py'
Oct 06 13:48:20 compute-0 sudo[191419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:20 compute-0 python3.9[191421]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:20 compute-0 sudo[191419]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:21 compute-0 sudo[191573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtbeigftlmkscflaygyivihrfsmvsyal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758500.967029-3385-110339684415743/AnsiballZ_file.py'
Oct 06 13:48:21 compute-0 sudo[191573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:21 compute-0 python3.9[191575]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:21 compute-0 sudo[191573]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:22 compute-0 sudo[191724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvrrqqepnxmuortigzhaetbfnanyaooe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758501.6474931-3385-24138084418814/AnsiballZ_copy.py'
Oct 06 13:48:22 compute-0 sudo[191724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:22 compute-0 python3.9[191726]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758501.6474931-3385-24138084418814/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:22 compute-0 sudo[191724]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:22 compute-0 sudo[191800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpifptzswwrupuwzldxjrhmuaaykikdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758501.6474931-3385-24138084418814/AnsiballZ_systemd.py'
Oct 06 13:48:22 compute-0 sudo[191800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:23 compute-0 python3.9[191802]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:48:23 compute-0 systemd[1]: Reloading.
Oct 06 13:48:23 compute-0 systemd-rc-local-generator[191832]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:48:23 compute-0 systemd-sysv-generator[191836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:48:23 compute-0 sudo[191800]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:23 compute-0 sudo[191911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyngsfkzegurceuadufgzgxgterqzjze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758501.6474931-3385-24138084418814/AnsiballZ_systemd.py'
Oct 06 13:48:23 compute-0 sudo[191911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:24 compute-0 python3.9[191913]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:48:24 compute-0 systemd[1]: Reloading.
Oct 06 13:48:24 compute-0 systemd-rc-local-generator[191943]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:48:24 compute-0 systemd-sysv-generator[191946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:48:24 compute-0 systemd[1]: Starting nova_compute container...
Oct 06 13:48:24 compute-0 podman[191951]: 2025-10-06 13:48:24.652308218 +0000 UTC m=+0.174426215 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 13:48:24 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:48:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:24 compute-0 podman[191953]: 2025-10-06 13:48:24.767193716 +0000 UTC m=+0.275923060 container init 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm)
Oct 06 13:48:24 compute-0 podman[191953]: 2025-10-06 13:48:24.775304697 +0000 UTC m=+0.284034001 container start 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=edpm, container_name=nova_compute, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 13:48:24 compute-0 nova_compute[191987]: + sudo -E kolla_set_configs
Oct 06 13:48:24 compute-0 nova_compute[191987]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:48:24 compute-0 nova_compute[191987]: INFO:__main__:Validating config file
Oct 06 13:48:24 compute-0 nova_compute[191987]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:48:24 compute-0 nova_compute[191987]: INFO:__main__:Copying service configuration files
Oct 06 13:48:24 compute-0 nova_compute[191987]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 06 13:48:25 compute-0 podman[191953]: nova_compute
Oct 06 13:48:25 compute-0 systemd[1]: Started nova_compute container.
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Deleting /etc/ceph
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Creating directory /etc/ceph
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /etc/ceph
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Writing out command to execute
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:25 compute-0 nova_compute[191987]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 06 13:48:25 compute-0 nova_compute[191987]: ++ cat /run_command
Oct 06 13:48:25 compute-0 sudo[191911]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:25 compute-0 nova_compute[191987]: + CMD=nova-compute
Oct 06 13:48:25 compute-0 nova_compute[191987]: + ARGS=
Oct 06 13:48:25 compute-0 nova_compute[191987]: + sudo kolla_copy_cacerts
Oct 06 13:48:25 compute-0 nova_compute[191987]: + [[ ! -n '' ]]
Oct 06 13:48:25 compute-0 nova_compute[191987]: + . kolla_extend_start
Oct 06 13:48:25 compute-0 nova_compute[191987]: + echo 'Running command: '\''nova-compute'\'''
Oct 06 13:48:25 compute-0 nova_compute[191987]: Running command: 'nova-compute'
Oct 06 13:48:25 compute-0 nova_compute[191987]: + umask 0022
Oct 06 13:48:25 compute-0 nova_compute[191987]: + exec nova-compute
Oct 06 13:48:26 compute-0 python3.9[192148]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:26 compute-0 python3.9[192298]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.061 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.061 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.061 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.061 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.179 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.194 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.225 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 06 13:48:27 compute-0 nova_compute[191987]: 2025-10-06 13:48:27.226 2 WARNING oslo_config.cfg [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 06 13:48:27 compute-0 python3.9[192451]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.220 2 INFO nova.virt.driver [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 06 13:48:28 compute-0 sudo[192603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafyhodjudicnkbdadngebsgvmukyoht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758507.9638805-3505-155522328121239/AnsiballZ_podman_container.py'
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.304 2 INFO nova.compute.provider_config [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 06 13:48:28 compute-0 sudo[192603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:28 compute-0 python3.9[192605]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 06 13:48:28 compute-0 sudo[192603]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:28 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.812 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.813 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.813 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.814 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.814 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.815 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.815 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.815 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.816 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.816 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.816 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.816 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.817 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.817 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.817 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.817 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.818 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.818 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.818 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.819 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.819 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.819 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.819 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.820 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.820 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.820 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.820 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.821 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.821 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.821 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.821 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.822 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.822 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.822 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.823 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.823 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.823 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.823 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.824 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.824 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.824 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.824 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.825 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.825 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.825 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.826 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.826 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.826 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.826 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.827 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.827 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.827 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.828 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.828 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.828 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.828 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.829 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.829 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.829 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.830 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.830 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.830 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.830 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.831 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.831 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.831 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.831 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.832 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.832 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.832 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.832 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.833 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.833 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.833 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.833 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.834 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.834 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.834 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.834 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.835 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.835 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.835 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.835 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.836 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.836 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.836 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.836 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.837 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.837 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.837 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.837 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.838 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.838 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.838 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.839 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.839 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.839 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.839 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.840 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.840 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.840 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.840 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.841 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.841 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.841 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.841 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.842 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.842 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.842 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.842 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.843 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.843 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.843 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.843 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.844 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.844 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.844 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.844 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.845 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.845 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.845 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.845 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.846 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.846 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.846 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.847 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.847 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.847 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.847 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.848 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.848 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.848 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.848 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.849 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.849 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.849 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.850 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.850 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.850 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.850 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.851 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.851 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.851 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.851 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.852 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.852 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.852 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.852 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.853 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.853 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.853 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.853 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.854 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.854 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.854 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.855 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.856 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.857 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.858 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.859 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.860 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.861 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.862 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.863 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.864 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.865 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.866 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.867 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.868 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.869 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.870 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.871 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.872 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.873 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.874 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.875 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.876 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.876 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.876 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.876 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.876 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.877 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.877 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.878 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.879 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.880 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.881 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.882 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.883 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.884 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.885 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.886 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.887 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.888 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.889 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.890 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.891 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.892 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.893 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.894 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.895 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.896 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.897 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.898 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 WARNING oslo_config.cfg [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 06 13:48:28 compute-0 nova_compute[191987]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 06 13:48:28 compute-0 nova_compute[191987]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 06 13:48:28 compute-0 nova_compute[191987]: and ``live_migration_inbound_addr`` respectively.
Oct 06 13:48:28 compute-0 nova_compute[191987]: ).  Its value may be silently ignored in the future.
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.899 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.900 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.901 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.902 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.903 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.904 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.905 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.906 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.907 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.908 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.909 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.910 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.911 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.912 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.913 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.914 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.915 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.916 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.917 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.918 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.919 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.920 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.921 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.922 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.923 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.924 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.925 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.926 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.927 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.928 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.929 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.930 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.931 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.932 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.933 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.934 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.935 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.936 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.937 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.938 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.939 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.940 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.941 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.942 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.943 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.944 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.945 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.946 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.947 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.948 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.949 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.949 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.949 2 DEBUG oslo_service.backend._eventlet.service [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 06 13:48:28 compute-0 nova_compute[191987]: 2025-10-06 13:48:28.950 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251002161230.cc74260.el10)
Oct 06 13:48:29 compute-0 sudo[192779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvjsviywzyijbvmjaotqetyvavbzqlch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758508.9676936-3521-159810901173874/AnsiballZ_systemd.py'
Oct 06 13:48:29 compute-0 sudo[192779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.464 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 06 13:48:29 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 06 13:48:29 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.533 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f59bd0554f0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 06 13:48:29 compute-0 nova_compute[191987]: libvirt:  error : internal error: could not initialize domain event timer
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.534 2 WARNING nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.534 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f59bd0554f0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.536 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.536 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.536 2 INFO nova.utils [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] The default thread pool MainProcess.default is initialized
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.537 2 DEBUG nova.virt.libvirt.host [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.537 2 INFO nova.virt.libvirt.driver [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Connection event '1' reason 'None'
Oct 06 13:48:29 compute-0 python3.9[192781]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:48:29 compute-0 systemd[1]: Stopping nova_compute container...
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.686 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.687 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 13:48:29 compute-0 nova_compute[191987]: 2025-10-06 13:48:29.687 2 DEBUG oslo_concurrency.lockutils [None req-6f80a858-f65d-43ce-9f66-fa2645c543a1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 13:48:30 compute-0 nova_compute[191987]: 2025-10-06 13:48:30.458 2 WARNING nova.virt.libvirt.driver [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 06 13:48:30 compute-0 nova_compute[191987]: 2025-10-06 13:48:30.459 2 DEBUG nova.virt.libvirt.volume.mount [None req-efec74ab-20f6-4def-bfea-0dbe400b228b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 06 13:48:30 compute-0 virtqemud[192802]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 06 13:48:30 compute-0 virtqemud[192802]: hostname: compute-0
Oct 06 13:48:30 compute-0 virtqemud[192802]: End of file while reading data: Input/output error
Oct 06 13:48:30 compute-0 systemd[1]: libpod-923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053.scope: Deactivated successfully.
Oct 06 13:48:30 compute-0 systemd[1]: libpod-923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053.scope: Consumed 3.043s CPU time.
Oct 06 13:48:30 compute-0 podman[192836]: 2025-10-06 13:48:30.927404759 +0000 UTC m=+1.287171742 container died 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4)
Oct 06 13:48:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053-userdata-shm.mount: Deactivated successfully.
Oct 06 13:48:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e-merged.mount: Deactivated successfully.
Oct 06 13:48:30 compute-0 podman[192836]: 2025-10-06 13:48:30.979027551 +0000 UTC m=+1.338794544 container cleanup 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 13:48:30 compute-0 podman[192836]: nova_compute
Oct 06 13:48:31 compute-0 podman[192874]: nova_compute
Oct 06 13:48:31 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 06 13:48:31 compute-0 systemd[1]: Stopped nova_compute container.
Oct 06 13:48:31 compute-0 systemd[1]: Starting nova_compute container...
Oct 06 13:48:31 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9312d196994e00775e0bf15cb9eb9bf4cd9d604eee4953e1d7f1e8688a85617e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:31 compute-0 podman[192888]: 2025-10-06 13:48:31.216380654 +0000 UTC m=+0.110974604 container init 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:48:31 compute-0 podman[192888]: 2025-10-06 13:48:31.230829506 +0000 UTC m=+0.125423426 container start 923a692a0b12a32fa5bcc7107a495ce0b03fc16ef9821f0132e724278c92d053 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Oct 06 13:48:31 compute-0 podman[192888]: nova_compute
Oct 06 13:48:31 compute-0 nova_compute[192903]: + sudo -E kolla_set_configs
Oct 06 13:48:31 compute-0 systemd[1]: Started nova_compute container.
Oct 06 13:48:31 compute-0 sudo[192779]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Validating config file
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying service configuration files
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /etc/ceph
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Creating directory /etc/ceph
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /etc/ceph
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Writing out command to execute
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:31 compute-0 nova_compute[192903]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 06 13:48:31 compute-0 nova_compute[192903]: ++ cat /run_command
Oct 06 13:48:31 compute-0 nova_compute[192903]: + CMD=nova-compute
Oct 06 13:48:31 compute-0 nova_compute[192903]: + ARGS=
Oct 06 13:48:31 compute-0 nova_compute[192903]: + sudo kolla_copy_cacerts
Oct 06 13:48:31 compute-0 nova_compute[192903]: + [[ ! -n '' ]]
Oct 06 13:48:31 compute-0 nova_compute[192903]: + . kolla_extend_start
Oct 06 13:48:31 compute-0 nova_compute[192903]: + echo 'Running command: '\''nova-compute'\'''
Oct 06 13:48:31 compute-0 nova_compute[192903]: Running command: 'nova-compute'
Oct 06 13:48:31 compute-0 nova_compute[192903]: + umask 0022
Oct 06 13:48:31 compute-0 nova_compute[192903]: + exec nova-compute
Oct 06 13:48:31 compute-0 sudo[193064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknrxwgjubcuooieauuuewjkcxpwhlrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758511.5504806-3539-207416953940404/AnsiballZ_podman_container.py'
Oct 06 13:48:31 compute-0 sudo[193064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:32 compute-0 python3.9[193066]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 06 13:48:32 compute-0 systemd[1]: Started libpod-conmon-46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6.scope.
Oct 06 13:48:32 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:48:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebc3f6cfa9931fa3fc11b9c3a249a6d2bd5cff5090298dfc54ee9b6071c48fd2/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebc3f6cfa9931fa3fc11b9c3a249a6d2bd5cff5090298dfc54ee9b6071c48fd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebc3f6cfa9931fa3fc11b9c3a249a6d2bd5cff5090298dfc54ee9b6071c48fd2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 06 13:48:32 compute-0 podman[193091]: 2025-10-06 13:48:32.401268208 +0000 UTC m=+0.155436431 container init 46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 13:48:32 compute-0 podman[193091]: 2025-10-06 13:48:32.414054745 +0000 UTC m=+0.168222908 container start 46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_build_tag=watcher_latest, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Oct 06 13:48:32 compute-0 python3.9[193066]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Applying nova statedir ownership
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 06 13:48:32 compute-0 nova_compute_init[193112]: INFO:nova_statedir:Nova statedir ownership complete
Oct 06 13:48:32 compute-0 systemd[1]: libpod-46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6.scope: Deactivated successfully.
Oct 06 13:48:32 compute-0 podman[193127]: 2025-10-06 13:48:32.542516272 +0000 UTC m=+0.028230497 container died 46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 13:48:32 compute-0 sudo[193064]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6-userdata-shm.mount: Deactivated successfully.
Oct 06 13:48:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-ebc3f6cfa9931fa3fc11b9c3a249a6d2bd5cff5090298dfc54ee9b6071c48fd2-merged.mount: Deactivated successfully.
Oct 06 13:48:32 compute-0 podman[193127]: 2025-10-06 13:48:32.817869277 +0000 UTC m=+0.303583542 container cleanup 46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6 (image=38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.151:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 13:48:32 compute-0 systemd[1]: libpod-conmon-46985cf1902553f19efc0ff938df26f865c97bbbe72cd927c58fd9cb1b4211f6.scope: Deactivated successfully.
Oct 06 13:48:33 compute-0 sshd-session[158522]: Connection closed by 192.168.122.30 port 60734
Oct 06 13:48:33 compute-0 sshd-session[158519]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:48:33 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Oct 06 13:48:33 compute-0 systemd[1]: session-25.scope: Consumed 2min 39.532s CPU time.
Oct 06 13:48:33 compute-0 systemd-logind[789]: Session 25 logged out. Waiting for processes to exit.
Oct 06 13:48:33 compute-0 systemd-logind[789]: Removed session 25.
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.402 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.403 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.403 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.403 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.523 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.550 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.580 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 06 13:48:33 compute-0 nova_compute[192903]: 2025-10-06 13:48:33.581 2 WARNING oslo_config.cfg [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 06 13:48:34 compute-0 nova_compute[192903]: 2025-10-06 13:48:34.582 2 INFO nova.virt.driver [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 06 13:48:34 compute-0 nova_compute[192903]: 2025-10-06 13:48:34.689 2 INFO nova.compute.provider_config [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.206 2 DEBUG oslo_concurrency.lockutils [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.207 2 DEBUG oslo_concurrency.lockutils [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.208 2 DEBUG oslo_concurrency.lockutils [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.208 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.208 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.209 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.209 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.210 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.210 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.210 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.211 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.211 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.211 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.212 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.212 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.212 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.213 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.213 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.213 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.214 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.214 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.215 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.215 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.215 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.216 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.216 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.216 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.217 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.217 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.217 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.218 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.218 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.219 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.219 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.219 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.220 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.220 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.220 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.221 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.221 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.221 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.222 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.222 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.223 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.223 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.223 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.224 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.224 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.224 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.225 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.225 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.226 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.226 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.226 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.227 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.227 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.228 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.228 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.228 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.229 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.229 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.229 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.230 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.230 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.230 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.231 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.231 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.231 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.232 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.232 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.232 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.233 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.233 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.233 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.234 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.234 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.234 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.235 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.235 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.235 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.236 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.236 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.236 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.237 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.237 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.237 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.238 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.238 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.238 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.239 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.239 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.239 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.239 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.240 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.240 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.240 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.240 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.241 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.241 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.241 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.241 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.242 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.242 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.242 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.242 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.243 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.243 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.243 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.243 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.244 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.244 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.244 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.244 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.245 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.245 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.245 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.246 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.246 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.247 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.247 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.247 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.247 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.248 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.248 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.248 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.249 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.249 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.249 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.249 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.250 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.250 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.250 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.250 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.251 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.251 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.251 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.251 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.252 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.252 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.252 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.253 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.253 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.253 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.253 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.254 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.254 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.254 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.254 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.255 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.255 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.255 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.256 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.256 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.256 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.256 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.257 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.257 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.257 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.258 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.258 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.258 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.258 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.259 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.259 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.259 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.259 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.260 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.260 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.260 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.260 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.261 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.261 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.261 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.262 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.262 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.262 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.262 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.262 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.263 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.263 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.263 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.263 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.264 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.264 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.264 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.264 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.265 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.265 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.265 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.265 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.266 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.266 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.266 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.266 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.267 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.267 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.267 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.267 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.267 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.268 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.268 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.268 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.268 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.269 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.269 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.269 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.269 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.270 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.270 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.270 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.270 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.271 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.271 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.271 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.271 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.272 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.272 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.272 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.272 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.272 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.273 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.273 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.273 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.273 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.274 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.275 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.275 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.275 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.276 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.276 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.276 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.276 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.277 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.277 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.277 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.278 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.279 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.280 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.281 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.282 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.283 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.284 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.284 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.284 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.284 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.285 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.285 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.285 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.285 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.285 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.286 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.287 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.288 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.289 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.290 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.291 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.291 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.291 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.291 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.292 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.293 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.293 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.293 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.293 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.293 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.294 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.295 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.295 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.296 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.297 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.298 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.299 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.300 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.301 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.301 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.301 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.301 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.301 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.302 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.303 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.304 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.305 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.306 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.306 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.306 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.306 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.307 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.308 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.309 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.309 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.310 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.311 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.312 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.313 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.314 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.315 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.316 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.317 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.318 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.319 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.320 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.321 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.322 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 WARNING oslo_config.cfg [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 06 13:48:35 compute-0 nova_compute[192903]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 06 13:48:35 compute-0 nova_compute[192903]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 06 13:48:35 compute-0 nova_compute[192903]: and ``live_migration_inbound_addr`` respectively.
Oct 06 13:48:35 compute-0 nova_compute[192903]: ).  Its value may be silently ignored in the future.
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.323 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.324 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.325 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.326 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.327 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.328 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.329 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.330 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.331 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.332 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.333 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.334 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.335 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.336 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.337 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.338 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.339 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.340 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.341 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.342 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.343 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.344 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.345 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.346 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.347 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.348 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.349 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.350 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.351 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.352 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.353 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.354 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.355 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.356 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.357 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.358 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.359 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.360 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.361 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.362 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.363 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.364 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.365 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.366 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.367 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.368 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.369 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.370 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.371 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.372 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.373 2 DEBUG oslo_service.backend._eventlet.service [None req-ce5e3483-5fcb-4c97-b1b2-321181828e6e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.373 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251002161230.cc74260.el10)
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.890 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.905 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9b16543f50> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 06 13:48:35 compute-0 nova_compute[192903]: libvirt:  error : internal error: could not initialize domain event timer
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.907 2 WARNING nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.908 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9b16543f50> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.911 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.912 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.912 2 INFO nova.utils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] The default thread pool MainProcess.default is initialized
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.913 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.914 2 INFO nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Connection event '1' reason 'None'
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.921 2 INFO nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Libvirt host capabilities <capabilities>
Oct 06 13:48:35 compute-0 nova_compute[192903]: 
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <host>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <uuid>3bc25ed8-3249-4f45-b283-3dd869d73ce5</uuid>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <cpu>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <arch>x86_64</arch>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model>EPYC-Rome-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <vendor>AMD</vendor>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <microcode version='16777317'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <signature family='23' model='49' stepping='0'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='x2apic'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='tsc-deadline'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='osxsave'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='hypervisor'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='tsc_adjust'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='spec-ctrl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='stibp'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='arch-capabilities'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='ssbd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='cmp_legacy'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='topoext'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='virt-ssbd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='lbrv'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='tsc-scale'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='vmcb-clean'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='pause-filter'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='pfthreshold'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='svme-addr-chk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='rdctl-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='skip-l1dfl-vmentry'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='mds-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature name='pschange-mc-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <pages unit='KiB' size='4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <pages unit='KiB' size='2048'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <pages unit='KiB' size='1048576'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </cpu>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <power_management>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <suspend_mem/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <suspend_disk/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <suspend_hybrid/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </power_management>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <iommu support='no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <migration_features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <live/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <uri_transports>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <uri_transport>tcp</uri_transport>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <uri_transport>rdma</uri_transport>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </uri_transports>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </migration_features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <topology>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <cells num='1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <cell id='0'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <memory unit='KiB'>7864096</memory>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <pages unit='KiB' size='4'>1966024</pages>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <pages unit='KiB' size='2048'>0</pages>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <distances>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <sibling id='0' value='10'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           </distances>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           <cpus num='8'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:           </cpus>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         </cell>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </cells>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </topology>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <cache>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </cache>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <secmodel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model>selinux</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <doi>0</doi>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </secmodel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <secmodel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model>dac</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <doi>0</doi>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </secmodel>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   </host>
Oct 06 13:48:35 compute-0 nova_compute[192903]: 
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <guest>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <os_type>hvm</os_type>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <arch name='i686'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <wordsize>32</wordsize>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <domain type='qemu'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <domain type='kvm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </arch>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <pae/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <nonpae/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <acpi default='on' toggle='yes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <apic default='on' toggle='no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <cpuselection/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <deviceboot/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <disksnapshot default='on' toggle='no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <externalSnapshot/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   </guest>
Oct 06 13:48:35 compute-0 nova_compute[192903]: 
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <guest>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <os_type>hvm</os_type>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <arch name='x86_64'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <wordsize>64</wordsize>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <domain type='qemu'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <domain type='kvm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </arch>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <acpi default='on' toggle='yes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <apic default='on' toggle='no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <cpuselection/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <deviceboot/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <disksnapshot default='on' toggle='no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <externalSnapshot/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </features>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   </guest>
Oct 06 13:48:35 compute-0 nova_compute[192903]: 
Oct 06 13:48:35 compute-0 nova_compute[192903]: </capabilities>
Oct 06 13:48:35 compute-0 nova_compute[192903]: 
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.932 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 06 13:48:35 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.963 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 06 13:48:35 compute-0 nova_compute[192903]: <domainCapabilities>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <path>/usr/libexec/qemu-kvm</path>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <domain>kvm</domain>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <arch>i686</arch>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <vcpu max='4096'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <iothreads supported='yes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <os supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <enum name='firmware'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <loader supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>rom</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>pflash</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <enum name='readonly'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>yes</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <enum name='secure'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </loader>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   </os>
Oct 06 13:48:35 compute-0 nova_compute[192903]:   <cpu>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <mode name='host-passthrough' supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <enum name='hostPassthroughMigratable'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <mode name='maximum' supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <enum name='maximumMigratable'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <mode name='host-model' supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <vendor>AMD</vendor>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='x2apic'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-deadline'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='hypervisor'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc_adjust'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='spec-ctrl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='stibp'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='arch-capabilities'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='ssbd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='cmp_legacy'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='overflow-recov'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='succor'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='ibrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='amd-ssbd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='virt-ssbd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='lbrv'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-scale'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='vmcb-clean'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='flushbyasid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='pause-filter'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='pfthreshold'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='svme-addr-chk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='rdctl-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='mds-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='pschange-mc-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='gds-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='require' name='rfds-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <feature policy='disable' name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:35 compute-0 nova_compute[192903]:     <mode name='custom' supported='yes'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-IBRS'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v5'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cooperlake'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Denverton'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Denverton-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Denverton-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Denverton-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Dhyana-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='EPYC-v4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx10'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx10-128'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx10-256'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx10-512'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-IBRS'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Haswell-v4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-noTSX'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v3'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v5'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v6'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v7'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='IvyBridge'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-IBRS'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v2'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='KnightsMill'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='KnightsMill-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 06 13:48:35 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v1'>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:35 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <memoryBacking supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='sourceType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>file</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>anonymous</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>memfd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </memoryBacking>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <disk supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='diskDevice'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>disk</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cdrom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>floppy</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>lun</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>fdc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>sata</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </disk>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <graphics supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vnc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egl-headless</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>dbus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </graphics>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <video supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='modelType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vga</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cirrus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>none</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>bochs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ramfb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </video>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hostdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='mode'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>subsystem</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='startupPolicy'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>mandatory</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>requisite</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>optional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='subsysType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pci</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='capsType'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='pciBackend'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hostdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <rng supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>random</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </rng>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <filesystem supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='driverType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>path</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>handle</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtiofs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </filesystem>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <tpm supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-tis</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-crb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emulator</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>external</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendVersion'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>2.0</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </tpm>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <redirdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </redirdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <channel supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pty</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>unix</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </channel>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <crypto supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>qemu</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </crypto>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <interface supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>passt</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </interface>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <panic supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>isa</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>hyperv</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </panic>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <features>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <gic supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <vmcoreinfo supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <genid supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backingStoreInput supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backup supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <async-teardown supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <ps2 supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sev supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sgx supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hyperv supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='features'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>relaxed</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vapic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>spinlocks</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vpindex</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>runtime</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>synic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>stimer</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reset</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vendor_id</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>frequencies</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reenlightenment</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tlbflush</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ipi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>avic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emsr_bitmap</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>xmm_input</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hyperv>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <launchSecurity supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </features>
Oct 06 13:48:36 compute-0 nova_compute[192903]: </domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:35.970 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 06 13:48:36 compute-0 nova_compute[192903]: <domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <path>/usr/libexec/qemu-kvm</path>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <domain>kvm</domain>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <arch>i686</arch>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <vcpu max='240'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <iothreads supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <os supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='firmware'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <loader supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>rom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pflash</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='readonly'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>yes</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='secure'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </loader>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </os>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-passthrough' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='hostPassthroughMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='maximum' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='maximumMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-model' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <vendor>AMD</vendor>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='x2apic'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-deadline'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='hypervisor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc_adjust'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='spec-ctrl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='stibp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='arch-capabilities'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='cmp_legacy'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='overflow-recov'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='succor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='amd-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='virt-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lbrv'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-scale'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='vmcb-clean'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='flushbyasid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pause-filter'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pfthreshold'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='svme-addr-chk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rdctl-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='mds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pschange-mc-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='gds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rfds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='disable' name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='custom' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Dhyana-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-128'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-256'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-512'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v6'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v7'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <memoryBacking supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='sourceType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>file</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>anonymous</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>memfd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </memoryBacking>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <disk supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='diskDevice'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>disk</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cdrom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>floppy</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>lun</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ide</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>fdc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>sata</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </disk>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <graphics supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vnc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egl-headless</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>dbus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </graphics>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <video supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='modelType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vga</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cirrus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>none</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>bochs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ramfb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </video>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hostdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='mode'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>subsystem</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='startupPolicy'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>mandatory</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>requisite</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>optional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='subsysType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pci</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='capsType'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='pciBackend'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hostdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <rng supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>random</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </rng>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <filesystem supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='driverType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>path</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>handle</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtiofs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </filesystem>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <tpm supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-tis</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-crb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emulator</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>external</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendVersion'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>2.0</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </tpm>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <redirdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </redirdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <channel supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pty</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>unix</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </channel>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <crypto supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>qemu</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </crypto>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <interface supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>passt</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </interface>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <panic supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>isa</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>hyperv</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </panic>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <features>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <gic supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <vmcoreinfo supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <genid supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backingStoreInput supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backup supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <async-teardown supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <ps2 supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sev supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sgx supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hyperv supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='features'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>relaxed</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vapic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>spinlocks</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vpindex</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>runtime</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>synic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>stimer</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reset</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vendor_id</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>frequencies</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reenlightenment</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tlbflush</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ipi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>avic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emsr_bitmap</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>xmm_input</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hyperv>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <launchSecurity supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </features>
Oct 06 13:48:36 compute-0 nova_compute[192903]: </domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.021 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.030 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 06 13:48:36 compute-0 nova_compute[192903]: <domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <path>/usr/libexec/qemu-kvm</path>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <domain>kvm</domain>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <arch>x86_64</arch>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <vcpu max='4096'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <iothreads supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <os supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='firmware'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>efi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <loader supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>rom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pflash</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='readonly'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>yes</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='secure'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>yes</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </loader>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </os>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-passthrough' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='hostPassthroughMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='maximum' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='maximumMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-model' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <vendor>AMD</vendor>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='x2apic'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-deadline'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='hypervisor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc_adjust'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='spec-ctrl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='stibp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='arch-capabilities'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='cmp_legacy'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='overflow-recov'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='succor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='amd-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='virt-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lbrv'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-scale'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='vmcb-clean'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='flushbyasid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pause-filter'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pfthreshold'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='svme-addr-chk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rdctl-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='mds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pschange-mc-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='gds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rfds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='disable' name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='custom' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Dhyana-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-128'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-256'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-512'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v6'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v7'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <memoryBacking supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='sourceType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>file</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>anonymous</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>memfd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </memoryBacking>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <disk supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='diskDevice'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>disk</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cdrom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>floppy</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>lun</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>fdc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>sata</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </disk>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <graphics supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vnc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egl-headless</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>dbus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </graphics>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <video supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='modelType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vga</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cirrus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>none</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>bochs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ramfb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </video>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hostdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='mode'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>subsystem</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='startupPolicy'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>mandatory</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>requisite</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>optional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='subsysType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pci</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='capsType'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='pciBackend'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hostdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <rng supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>random</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </rng>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <filesystem supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='driverType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>path</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>handle</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtiofs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </filesystem>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <tpm supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-tis</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-crb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emulator</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>external</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendVersion'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>2.0</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </tpm>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <redirdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </redirdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <channel supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pty</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>unix</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </channel>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <crypto supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>qemu</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </crypto>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <interface supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>passt</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </interface>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <panic supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>isa</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>hyperv</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </panic>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <features>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <gic supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <vmcoreinfo supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <genid supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backingStoreInput supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backup supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <async-teardown supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <ps2 supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sev supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sgx supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hyperv supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='features'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>relaxed</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vapic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>spinlocks</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vpindex</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>runtime</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>synic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>stimer</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reset</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vendor_id</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>frequencies</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reenlightenment</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tlbflush</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ipi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>avic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emsr_bitmap</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>xmm_input</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hyperv>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <launchSecurity supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </features>
Oct 06 13:48:36 compute-0 nova_compute[192903]: </domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.087 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 06 13:48:36 compute-0 nova_compute[192903]: <domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <path>/usr/libexec/qemu-kvm</path>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <domain>kvm</domain>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <arch>x86_64</arch>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <vcpu max='240'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <iothreads supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <os supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='firmware'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <loader supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>rom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pflash</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='readonly'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>yes</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='secure'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>no</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </loader>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </os>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-passthrough' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='hostPassthroughMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='maximum' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='maximumMigratable'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>on</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>off</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='host-model' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <vendor>AMD</vendor>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='x2apic'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-deadline'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='hypervisor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc_adjust'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='spec-ctrl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='stibp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='arch-capabilities'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='cmp_legacy'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='overflow-recov'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='succor'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='amd-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='virt-ssbd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lbrv'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='tsc-scale'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='vmcb-clean'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='flushbyasid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pause-filter'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pfthreshold'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='svme-addr-chk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rdctl-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='mds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='pschange-mc-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='gds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='require' name='rfds-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <feature policy='disable' name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <mode name='custom' supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Broadwell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cascadelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Cooperlake-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Denverton-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Dhyana-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Genoa-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='auto-ibrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Milan-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amd-psfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='no-nested-data-bp'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='null-sel-clr-base'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='stibp-always-on'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-Rome-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='EPYC-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='GraniteRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-128'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-256'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx10-512'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='prefetchiti'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Haswell-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-noTSX'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v6'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Icelake-Server-v7'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='IvyBridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='KnightsMill-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4fmaps'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-4vnniw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512er'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512pf'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G4-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Opteron_G5-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fma4'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tbm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xop'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SapphireRapids-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='amx-tile'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-bf16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-fp16'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512-vpopcntdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bitalg'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vbmi2'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrc'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fzrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='la57'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='taa-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='tsx-ldtrk'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xfd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='SierraForest-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ifma'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-ne-convert'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx-vnni-int8'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='bus-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cmpccxadd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fbsdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='fsrs'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ibrs-all'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mcdt-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pbrsb-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='psdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='sbdr-ssdp-no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='serialize'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vaes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='vpclmulqdq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Client-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='hle'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='rtm'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Skylake-Server-v5'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512bw'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512cd'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512dq'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512f'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='avx512vl'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='invpcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pcid'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='pku'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='mpx'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v2'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v3'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='core-capability'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='split-lock-detect'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='Snowridge-v4'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='cldemote'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='erms'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='gfni'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdir64b'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='movdiri'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='xsaves'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='athlon-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='core2duo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='coreduo-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='n270-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='ss'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <blockers model='phenom-v1'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnow'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <feature name='3dnowext'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </blockers>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </mode>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </cpu>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <memoryBacking supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <enum name='sourceType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>file</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>anonymous</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <value>memfd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </memoryBacking>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <disk supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='diskDevice'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>disk</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cdrom</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>floppy</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>lun</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ide</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>fdc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>sata</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </disk>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <graphics supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vnc</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egl-headless</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>dbus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </graphics>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <video supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='modelType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vga</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>cirrus</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>none</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>bochs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ramfb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </video>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hostdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='mode'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>subsystem</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='startupPolicy'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>mandatory</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>requisite</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>optional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='subsysType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pci</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>scsi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='capsType'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='pciBackend'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hostdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <rng supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtio-non-transitional</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>random</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>egd</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </rng>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <filesystem supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='driverType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>path</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>handle</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>virtiofs</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </filesystem>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <tpm supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-tis</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tpm-crb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emulator</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>external</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendVersion'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>2.0</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </tpm>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <redirdev supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='bus'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>usb</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </redirdev>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <channel supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>pty</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>unix</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </channel>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <crypto supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='type'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>qemu</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendModel'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>builtin</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </crypto>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <interface supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='backendType'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>default</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>passt</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </interface>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <panic supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='model'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>isa</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>hyperv</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </panic>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </devices>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   <features>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <gic supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <vmcoreinfo supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <genid supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backingStoreInput supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <backup supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <async-teardown supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <ps2 supported='yes'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sev supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <sgx supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <hyperv supported='yes'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       <enum name='features'>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>relaxed</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vapic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>spinlocks</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vpindex</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>runtime</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>synic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>stimer</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reset</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>vendor_id</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>frequencies</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>reenlightenment</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>tlbflush</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>ipi</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>avic</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>emsr_bitmap</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:         <value>xmm_input</value>
Oct 06 13:48:36 compute-0 nova_compute[192903]:       </enum>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     </hyperv>
Oct 06 13:48:36 compute-0 nova_compute[192903]:     <launchSecurity supported='no'/>
Oct 06 13:48:36 compute-0 nova_compute[192903]:   </features>
Oct 06 13:48:36 compute-0 nova_compute[192903]: </domainCapabilities>
Oct 06 13:48:36 compute-0 nova_compute[192903]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.160 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.160 2 INFO nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Secure Boot support detected
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.168 2 INFO nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.345 2 DEBUG nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.456 2 WARNING nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.457 2 DEBUG nova.virt.libvirt.volume.mount [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 06 13:48:36 compute-0 nova_compute[192903]: 2025-10-06 13:48:36.866 2 INFO nova.virt.node [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Determined node identity 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from /var/lib/nova/compute_id
Oct 06 13:48:37 compute-0 nova_compute[192903]: 2025-10-06 13:48:37.427 2 WARNING nova.compute.manager [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Compute nodes ['603c9dc2-ee32-4e36-82be-dcfb995e2be1'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 06 13:48:38 compute-0 nova_compute[192903]: 2025-10-06 13:48:38.596 2 INFO nova.compute.manager [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 06 13:48:39 compute-0 sshd-session[193227]: Accepted publickey for zuul from 192.168.122.30 port 56880 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 13:48:39 compute-0 systemd-logind[789]: New session 28 of user zuul.
Oct 06 13:48:39 compute-0 systemd[1]: Started Session 28 of User zuul.
Oct 06 13:48:39 compute-0 sshd-session[193227]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.637 2 WARNING nova.compute.manager [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.638 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.638 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.638 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.639 2 DEBUG nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.795 2 WARNING nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.796 2 DEBUG oslo_concurrency.processutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.835 2 DEBUG oslo_concurrency.processutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.835 2 DEBUG nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6199MB free_disk=73.50839614868164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.835 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:48:39 compute-0 nova_compute[192903]: 2025-10-06 13:48:39.836 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:48:40 compute-0 nova_compute[192903]: 2025-10-06 13:48:40.388 2 WARNING nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] No compute node record for compute-0.ctlplane.example.com:603c9dc2-ee32-4e36-82be-dcfb995e2be1: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 603c9dc2-ee32-4e36-82be-dcfb995e2be1 could not be found.
Oct 06 13:48:40 compute-0 python3.9[193381]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 06 13:48:40 compute-0 nova_compute[192903]: 2025-10-06 13:48:40.940 2 INFO nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 603c9dc2-ee32-4e36-82be-dcfb995e2be1
Oct 06 13:48:41 compute-0 sudo[193535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fimxyqdwynwaqkgcwxtgczubimjcelvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758521.147211-52-81818169838278/AnsiballZ_systemd_service.py'
Oct 06 13:48:41 compute-0 sudo[193535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:42 compute-0 python3.9[193537]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:48:42 compute-0 systemd[1]: Reloading.
Oct 06 13:48:42 compute-0 systemd-rc-local-generator[193562]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:48:42 compute-0 systemd-sysv-generator[193566]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:48:42 compute-0 sudo[193535]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:42 compute-0 nova_compute[192903]: 2025-10-06 13:48:42.568 2 DEBUG nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:48:42 compute-0 nova_compute[192903]: 2025-10-06 13:48:42.570 2 DEBUG nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:48:39 up 49 min,  0 user,  load average: 0.85, 0.89, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.014 2 INFO nova.scheduler.client.report [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] [req-2dc8f07f-4ecf-4194-9443-477f3651bb38] Created resource provider record via placement API for resource provider with UUID 603c9dc2-ee32-4e36-82be-dcfb995e2be1 and name compute-0.ctlplane.example.com.
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.052 2 DEBUG nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 06 13:48:43 compute-0 nova_compute[192903]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.053 2 INFO nova.virt.libvirt.host [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] kernel doesn't support AMD SEV
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.053 2 DEBUG nova.compute.provider_tree [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.054 2 DEBUG nova.virt.libvirt.driver [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 13:48:43 compute-0 python3.9[193723]: ansible-ansible.builtin.service_facts Invoked
Oct 06 13:48:43 compute-0 network[193740]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 06 13:48:43 compute-0 network[193741]: 'network-scripts' will be removed from distribution in near future.
Oct 06 13:48:43 compute-0 network[193742]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.756 2 DEBUG nova.scheduler.client.report [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Updated inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.756 2 DEBUG nova.compute.provider_tree [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.756 2 DEBUG nova.compute.provider_tree [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 13:48:43 compute-0 nova_compute[192903]: 2025-10-06 13:48:43.942 2 DEBUG nova.compute.provider_tree [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 13:48:44 compute-0 nova_compute[192903]: 2025-10-06 13:48:44.454 2 DEBUG nova.compute.resource_tracker [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:48:44 compute-0 nova_compute[192903]: 2025-10-06 13:48:44.455 2 DEBUG oslo_concurrency.lockutils [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.619s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:48:44 compute-0 nova_compute[192903]: 2025-10-06 13:48:44.455 2 DEBUG nova.service [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Oct 06 13:48:44 compute-0 nova_compute[192903]: 2025-10-06 13:48:44.629 2 DEBUG nova.service [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Oct 06 13:48:44 compute-0 nova_compute[192903]: 2025-10-06 13:48:44.630 2 DEBUG nova.servicegroup.drivers.db [None req-28ac25ee-12f1-46f8-8686-b9eb7cf55694 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Oct 06 13:48:48 compute-0 podman[193992]: 2025-10-06 13:48:48.121706629 +0000 UTC m=+0.058234782 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 06 13:48:48 compute-0 sudo[194050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqnivgvvlijlprfuolmqwatigaapgcdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758527.7837415-90-72787219314230/AnsiballZ_systemd_service.py'
Oct 06 13:48:48 compute-0 sudo[194050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:48 compute-0 podman[193991]: 2025-10-06 13:48:48.150936213 +0000 UTC m=+0.092174363 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true)
Oct 06 13:48:48 compute-0 python3.9[194057]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:48:48 compute-0 sudo[194050]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:48 compute-0 podman[194063]: 2025-10-06 13:48:48.530249239 +0000 UTC m=+0.064940133 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:48:49 compute-0 sudo[194233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cebpdzarmrptfokzfvnvgmlfpejeynxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758528.7714045-110-231175132464527/AnsiballZ_file.py'
Oct 06 13:48:49 compute-0 sudo[194233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:49 compute-0 python3.9[194235]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:49 compute-0 sudo[194233]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:49 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 13:48:50 compute-0 sudo[194386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywqlaowxhouhblpeuhrrvnnshvkybmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758529.9430175-126-248033307315601/AnsiballZ_file.py'
Oct 06 13:48:50 compute-0 sudo[194386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:50 compute-0 python3.9[194388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:48:50 compute-0 sudo[194386]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:51 compute-0 sudo[194538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkxudldrszxqrjejefxbwoityquupoii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758530.7525349-144-156430097415488/AnsiballZ_command.py'
Oct 06 13:48:51 compute-0 sudo[194538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:51 compute-0 python3.9[194540]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:48:51 compute-0 sudo[194538]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:52 compute-0 python3.9[194692]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:48:53 compute-0 sudo[194842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgdiefbcdelbbrjbqfumxzvavwpaepvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758532.7819254-180-29967110944387/AnsiballZ_systemd_service.py'
Oct 06 13:48:53 compute-0 sudo[194842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:53 compute-0 python3.9[194844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:48:53 compute-0 systemd[1]: Reloading.
Oct 06 13:48:53 compute-0 systemd-rc-local-generator[194868]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:48:53 compute-0 systemd-sysv-generator[194872]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:48:53 compute-0 sudo[194842]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:54 compute-0 sudo[195030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpjanmqzyhjsohonctihwzmnkfrkvlfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758533.8634708-196-162302581948273/AnsiballZ_command.py'
Oct 06 13:48:54 compute-0 sudo[195030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:54 compute-0 python3.9[195032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:48:54 compute-0 sudo[195030]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:55 compute-0 sudo[195199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgrvnanqfhawqkcbiuaihsxzmgfopwbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758534.7165468-214-110346775003945/AnsiballZ_file.py'
Oct 06 13:48:55 compute-0 sudo[195199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:55 compute-0 podman[195157]: 2025-10-06 13:48:55.114920393 +0000 UTC m=+0.070241338 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 13:48:55 compute-0 python3.9[195205]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:55 compute-0 sudo[195199]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:56 compute-0 python3.9[195356]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:48:57 compute-0 python3.9[195508]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:48:57 compute-0 python3.9[195629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758536.7417684-246-39255702080348/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:48:58 compute-0 sudo[195779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqiydnwfqylsbhcuiidntjuhriyrbhut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758538.125386-276-218557135413827/AnsiballZ_group.py'
Oct 06 13:48:58 compute-0 sudo[195779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:58 compute-0 python3.9[195781]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 06 13:48:58 compute-0 sudo[195779]: pam_unix(sudo:session): session closed for user root
Oct 06 13:48:59 compute-0 sudo[195931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isrhoznbgboqqhajnamiirrtnzcblumg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758539.2741387-298-141372461856963/AnsiballZ_getent.py'
Oct 06 13:48:59 compute-0 sudo[195931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:48:59 compute-0 python3.9[195933]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 06 13:48:59 compute-0 sudo[195931]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:00 compute-0 sudo[196084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucmbrtyelcjhdwmokwovdmiyecmdfhup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758540.1785262-314-65270425345627/AnsiballZ_group.py'
Oct 06 13:49:00 compute-0 sudo[196084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:00 compute-0 python3.9[196086]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 06 13:49:00 compute-0 groupadd[196087]: group added to /etc/group: name=ceilometer, GID=42405
Oct 06 13:49:00 compute-0 groupadd[196087]: group added to /etc/gshadow: name=ceilometer
Oct 06 13:49:00 compute-0 groupadd[196087]: new group: name=ceilometer, GID=42405
Oct 06 13:49:00 compute-0 sudo[196084]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:01 compute-0 sudo[196242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlyfxzglqahwlojhojllszhrwikhqwea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758541.1331842-330-239919586641727/AnsiballZ_user.py'
Oct 06 13:49:01 compute-0 sudo[196242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:02 compute-0 python3.9[196244]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 06 13:49:02 compute-0 useradd[196246]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Oct 06 13:49:02 compute-0 useradd[196246]: add 'ceilometer' to group 'libvirt'
Oct 06 13:49:02 compute-0 useradd[196246]: add 'ceilometer' to shadow group 'libvirt'
Oct 06 13:49:02 compute-0 sudo[196242]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:03 compute-0 python3.9[196402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:04 compute-0 python3.9[196523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759758543.0452778-382-234066689247910/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:05 compute-0 python3.9[196673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:05 compute-0 python3.9[196794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759758544.4458964-382-118987996498215/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:06 compute-0 python3.9[196944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:07 compute-0 python3.9[197065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759758545.768925-382-158421719971144/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:07 compute-0 python3.9[197215]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:49:08 compute-0 python3.9[197367]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:49:09 compute-0 python3.9[197519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:09 compute-0 python3.9[197640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758548.6871774-500-19740200370897/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:10 compute-0 python3.9[197790]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:11 compute-0 python3.9[197866]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:49:11.327 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:49:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:49:11.327 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:49:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:49:11.328 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:49:11 compute-0 python3.9[198017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:12 compute-0 python3.9[198138]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758551.275619-500-81408492552217/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=534ac0499fced69d301b4e24be69b67a350d2de7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:13 compute-0 python3.9[198288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:13 compute-0 python3.9[198409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758552.640784-500-251689430244104/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:14 compute-0 python3.9[198559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:15 compute-0 python3.9[198680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758553.9345336-500-273592924157424/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:15 compute-0 nova_compute[192903]: 2025-10-06 13:49:15.632 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:16 compute-0 python3.9[198830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:16 compute-0 nova_compute[192903]: 2025-10-06 13:49:16.152 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:16 compute-0 python3.9[198951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758555.33465-500-67952948818911/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:17 compute-0 python3.9[199101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:18 compute-0 python3.9[199222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758556.837241-500-169325578863674/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:18 compute-0 podman[199358]: 2025-10-06 13:49:18.663654988 +0000 UTC m=+0.080229979 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 13:49:18 compute-0 podman[199355]: 2025-10-06 13:49:18.663792652 +0000 UTC m=+0.078798730 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:49:18 compute-0 podman[199347]: 2025-10-06 13:49:18.695736639 +0000 UTC m=+0.124199042 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 13:49:18 compute-0 python3.9[199399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:19 compute-0 python3.9[199554]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758558.227903-500-101036761349679/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:20 compute-0 python3.9[199704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:20 compute-0 python3.9[199825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758559.5786657-500-244151899137669/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:21 compute-0 python3.9[199975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:22 compute-0 python3.9[200096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758561.14883-500-183637241463905/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:23 compute-0 python3.9[200246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:23 compute-0 python3.9[200367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758562.518015-500-235677134222082/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:24 compute-0 python3.9[200517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:24 compute-0 python3.9[200593]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:25 compute-0 podman[200717]: 2025-10-06 13:49:25.760146397 +0000 UTC m=+0.074374990 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 13:49:25 compute-0 python3.9[200758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:26 compute-0 python3.9[200839]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:27 compute-0 python3.9[200989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:27 compute-0 python3.9[201065]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:28 compute-0 sudo[201215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrzgnfbqczjzqskbgztoexrlyydsuux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758567.7918057-878-238455085283716/AnsiballZ_file.py'
Oct 06 13:49:28 compute-0 sudo[201215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:28 compute-0 python3.9[201217]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:28 compute-0 sudo[201215]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:28 compute-0 sudo[201367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwfzaqfrtzawykpsltorjyynlbolulfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758568.4952264-894-241290303856273/AnsiballZ_file.py'
Oct 06 13:49:28 compute-0 sudo[201367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:28 compute-0 python3.9[201369]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:29 compute-0 sudo[201367]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:29 compute-0 sudo[201519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubdecwbpcuymqicennctumnprxcbeivb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758569.187355-910-130184805428408/AnsiballZ_file.py'
Oct 06 13:49:29 compute-0 sudo[201519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:29 compute-0 python3.9[201521]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:49:29 compute-0 sudo[201519]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:30 compute-0 sudo[201672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijfhcmqyphzvvrixfzcktemjhdzyklze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758569.9058719-926-123574493091932/AnsiballZ_systemd_service.py'
Oct 06 13:49:30 compute-0 sudo[201672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:30 compute-0 python3.9[201674]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:49:30 compute-0 systemd[1]: Reloading.
Oct 06 13:49:30 compute-0 systemd-rc-local-generator[201705]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:49:30 compute-0 systemd-sysv-generator[201710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:49:31 compute-0 systemd[1]: Listening on Podman API Socket.
Oct 06 13:49:31 compute-0 sudo[201672]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:31 compute-0 sudo[201865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnlcqfnaxmkbbabhbdtyrdawwwvrrzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758571.4468231-944-12350535040562/AnsiballZ_stat.py'
Oct 06 13:49:31 compute-0 sudo[201865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:31 compute-0 python3.9[201867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:31 compute-0 sudo[201865]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:32 compute-0 sudo[201988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtaeqrjiavjtnrziiiicdqtinvnerift ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758571.4468231-944-12350535040562/AnsiballZ_copy.py'
Oct 06 13:49:32 compute-0 sudo[201988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:32 compute-0 python3.9[201990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758571.4468231-944-12350535040562/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:49:32 compute-0 sudo[201988]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:33 compute-0 sudo[202140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujtlwamfuuaakulxrtkvlojoedjgttlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758572.8780587-978-22907713337755/AnsiballZ_container_config_data.py'
Oct 06 13:49:33 compute-0 sudo[202140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:33 compute-0 python3.9[202142]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.585 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.585 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:49:33 compute-0 nova_compute[192903]: 2025-10-06 13:49:33.585 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:49:33 compute-0 sudo[202140]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.133 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.134 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.134 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.135 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:49:34 compute-0 sudo[202292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tucncxtxtsrthlbybqpfrditmeecndrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758573.8202143-996-274122782726188/AnsiballZ_container_config_hash.py'
Oct 06 13:49:34 compute-0 sudo[202292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.370 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.372 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.410 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.412 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6155MB free_disk=73.5077018737793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.412 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:49:34 compute-0 nova_compute[192903]: 2025-10-06 13:49:34.413 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:49:34 compute-0 python3.9[202294]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:49:34 compute-0 sudo[202292]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:35 compute-0 nova_compute[192903]: 2025-10-06 13:49:35.491 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:49:35 compute-0 nova_compute[192903]: 2025-10-06 13:49:35.492 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:49:34 up 50 min,  0 user,  load average: 0.67, 0.83, 0.69\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:49:35 compute-0 nova_compute[192903]: 2025-10-06 13:49:35.516 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:49:35 compute-0 sudo[202445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-warlmblygijuayfndwbftjbzgbusspfx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758575.1282756-1016-2956785617872/AnsiballZ_edpm_container_manage.py'
Oct 06 13:49:35 compute-0 sudo[202445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:35 compute-0 python3[202447]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:49:36 compute-0 nova_compute[192903]: 2025-10-06 13:49:36.027 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:49:36 compute-0 nova_compute[192903]: 2025-10-06 13:49:36.539 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:49:36 compute-0 nova_compute[192903]: 2025-10-06 13:49:36.540 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:49:37 compute-0 podman[202460]: 2025-10-06 13:49:37.396074747 +0000 UTC m=+1.355075834 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 06 13:49:37 compute-0 podman[202558]: 2025-10-06 13:49:37.599828141 +0000 UTC m=+0.067921466 container create fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Oct 06 13:49:37 compute-0 podman[202558]: 2025-10-06 13:49:37.565575485 +0000 UTC m=+0.033668890 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 06 13:49:37 compute-0 python3[202447]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct 06 13:49:37 compute-0 sudo[202445]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:38 compute-0 sudo[202746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zytdcgevtiulzskgppmkecuicfioabcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758577.9817994-1032-216470561460651/AnsiballZ_stat.py'
Oct 06 13:49:38 compute-0 sudo[202746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:38 compute-0 python3.9[202748]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:49:38 compute-0 sudo[202746]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:39 compute-0 sudo[202900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkwwbsgmkjextzifpgftowypkrrpikvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758578.8501296-1050-64550525822566/AnsiballZ_file.py'
Oct 06 13:49:39 compute-0 sudo[202900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:39 compute-0 python3.9[202902]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:39 compute-0 sudo[202900]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:40 compute-0 sudo[203051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprqmaccorexxexjawdzzbdwwzfuywlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758579.650431-1050-269148863653814/AnsiballZ_copy.py'
Oct 06 13:49:40 compute-0 sudo[203051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:40 compute-0 python3.9[203053]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758579.650431-1050-269148863653814/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:40 compute-0 sudo[203051]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:40 compute-0 sudo[203127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chfbodqjbunsccdsqaotarkdmiffwmsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758579.650431-1050-269148863653814/AnsiballZ_systemd.py'
Oct 06 13:49:40 compute-0 sudo[203127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:41 compute-0 python3.9[203129]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:49:41 compute-0 systemd[1]: Reloading.
Oct 06 13:49:41 compute-0 systemd-rc-local-generator[203158]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:49:41 compute-0 systemd-sysv-generator[203162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:49:41 compute-0 sudo[203127]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:41 compute-0 sudo[203240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsumznpagpxupmgalvzsdyaxludqjar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758579.650431-1050-269148863653814/AnsiballZ_systemd.py'
Oct 06 13:49:41 compute-0 sudo[203240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:42 compute-0 python3.9[203242]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:49:42 compute-0 systemd[1]: Reloading.
Oct 06 13:49:42 compute-0 systemd-sysv-generator[203275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:49:42 compute-0 systemd-rc-local-generator[203269]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:49:42 compute-0 systemd[1]: Starting podman_exporter container...
Oct 06 13:49:42 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:49:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ee005450d98cf6d7a65442de67452182c5b159b438ebf2fb0740a63d3c10eac/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ee005450d98cf6d7a65442de67452182c5b159b438ebf2fb0740a63d3c10eac/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.
Oct 06 13:49:42 compute-0 podman[203282]: 2025-10-06 13:49:42.601844871 +0000 UTC m=+0.168298977 container init fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.625Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.625Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.625Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.625Z caller=handler.go:105 level=info collector=container
Oct 06 13:49:42 compute-0 podman[203282]: 2025-10-06 13:49:42.641171365 +0000 UTC m=+0.207625461 container start fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:49:42 compute-0 podman[203282]: podman_exporter
Oct 06 13:49:42 compute-0 systemd[1]: Starting Podman API Service...
Oct 06 13:49:42 compute-0 systemd[1]: Started podman_exporter container.
Oct 06 13:49:42 compute-0 systemd[1]: Started Podman API Service.
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="Setting parallel job count to 25"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="Using sqlite as database backend"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct 06 13:49:42 compute-0 sudo[203240]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:42 compute-0 podman[203308]: @ - - [06/Oct/2025:13:49:42 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 06 13:49:42 compute-0 podman[203308]: time="2025-10-06T13:49:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:49:42 compute-0 podman[203308]: @ - - [06/Oct/2025:13:49:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16549 "" "Go-http-client/1.1"
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.741Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.743Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 06 13:49:42 compute-0 podman_exporter[203297]: ts=2025-10-06T13:49:42.744Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 06 13:49:42 compute-0 podman[203306]: 2025-10-06 13:49:42.747788886 +0000 UTC m=+0.086937465 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:49:42 compute-0 systemd[1]: fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12-7f55a82a507c194d.service: Main process exited, code=exited, status=1/FAILURE
Oct 06 13:49:42 compute-0 systemd[1]: fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12-7f55a82a507c194d.service: Failed with result 'exit-code'.
Oct 06 13:49:43 compute-0 sudo[203493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvqhttqwxyfggfekpedqsxbzqvlpqcfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758582.9470024-1098-104088101657284/AnsiballZ_systemd.py'
Oct 06 13:49:43 compute-0 sudo[203493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:43 compute-0 python3.9[203495]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:49:44 compute-0 systemd[1]: Stopping podman_exporter container...
Oct 06 13:49:44 compute-0 podman[203308]: @ - - [06/Oct/2025:13:49:42 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct 06 13:49:44 compute-0 systemd[1]: libpod-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.scope: Deactivated successfully.
Oct 06 13:49:44 compute-0 podman[203499]: 2025-10-06 13:49:44.794256969 +0000 UTC m=+0.053362477 container died fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:49:44 compute-0 systemd[1]: fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12-7f55a82a507c194d.timer: Deactivated successfully.
Oct 06 13:49:44 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.
Oct 06 13:49:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12-userdata-shm.mount: Deactivated successfully.
Oct 06 13:49:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ee005450d98cf6d7a65442de67452182c5b159b438ebf2fb0740a63d3c10eac-merged.mount: Deactivated successfully.
Oct 06 13:49:44 compute-0 podman[203499]: 2025-10-06 13:49:44.974074489 +0000 UTC m=+0.233180017 container cleanup fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:49:44 compute-0 podman[203499]: podman_exporter
Oct 06 13:49:44 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 06 13:49:45 compute-0 podman[203528]: podman_exporter
Oct 06 13:49:45 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 06 13:49:45 compute-0 systemd[1]: Stopped podman_exporter container.
Oct 06 13:49:45 compute-0 systemd[1]: Starting podman_exporter container...
Oct 06 13:49:45 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:49:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ee005450d98cf6d7a65442de67452182c5b159b438ebf2fb0740a63d3c10eac/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ee005450d98cf6d7a65442de67452182c5b159b438ebf2fb0740a63d3c10eac/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.
Oct 06 13:49:45 compute-0 podman[203541]: 2025-10-06 13:49:45.282696737 +0000 UTC m=+0.166912159 container init fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.312Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.312Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.312Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.312Z caller=handler.go:105 level=info collector=container
Oct 06 13:49:45 compute-0 podman[203308]: @ - - [06/Oct/2025:13:49:45 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 06 13:49:45 compute-0 podman[203308]: time="2025-10-06T13:49:45Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:49:45 compute-0 podman[203541]: 2025-10-06 13:49:45.318233737 +0000 UTC m=+0.202449119 container start fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:49:45 compute-0 podman[203541]: podman_exporter
Oct 06 13:49:45 compute-0 systemd[1]: Started podman_exporter container.
Oct 06 13:49:45 compute-0 podman[203308]: @ - - [06/Oct/2025:13:49:45 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16551 "" "Go-http-client/1.1"
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.350Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.351Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 06 13:49:45 compute-0 podman_exporter[203557]: ts=2025-10-06T13:49:45.352Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 06 13:49:45 compute-0 sudo[203493]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:45 compute-0 podman[203566]: 2025-10-06 13:49:45.430326678 +0000 UTC m=+0.093431912 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:49:45 compute-0 sudo[203740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgwhsjaeejuqaxuprjwksjjvwchcqfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758585.600753-1114-91643307560575/AnsiballZ_stat.py'
Oct 06 13:49:45 compute-0 sudo[203740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:46 compute-0 python3.9[203742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:49:46 compute-0 sudo[203740]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:46 compute-0 sudo[203863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuuffhqjaoitziaxxltoxytnpcqickzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758585.600753-1114-91643307560575/AnsiballZ_copy.py'
Oct 06 13:49:46 compute-0 sudo[203863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:46 compute-0 python3.9[203865]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759758585.600753-1114-91643307560575/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 06 13:49:46 compute-0 sudo[203863]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:47 compute-0 sudo[204015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzuvwqpddejdugflhxsaqhqqujfxbrpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758587.1559103-1148-258840574421003/AnsiballZ_container_config_data.py'
Oct 06 13:49:47 compute-0 sudo[204015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:47 compute-0 python3.9[204017]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 06 13:49:47 compute-0 sudo[204015]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:48 compute-0 sudo[204167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdrjpdufnwkfhyoatcefjzwamyjwldat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758587.946427-1166-192035140041494/AnsiballZ_container_config_hash.py'
Oct 06 13:49:48 compute-0 sudo[204167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:48 compute-0 python3.9[204169]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 06 13:49:48 compute-0 sudo[204167]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:49 compute-0 podman[204248]: 2025-10-06 13:49:49.234889919 +0000 UTC m=+0.094099410 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 13:49:49 compute-0 podman[204247]: 2025-10-06 13:49:49.247178975 +0000 UTC m=+0.110159609 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Oct 06 13:49:49 compute-0 podman[204246]: 2025-10-06 13:49:49.259279155 +0000 UTC m=+0.120734787 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 06 13:49:49 compute-0 sudo[204381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odzdeszaljcizeigfznfunzeztdhflmf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758588.9423401-1186-258917764292731/AnsiballZ_edpm_container_manage.py'
Oct 06 13:49:49 compute-0 sudo[204381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:49 compute-0 python3[204383]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 06 13:49:52 compute-0 podman[204396]: 2025-10-06 13:49:52.964750421 +0000 UTC m=+3.144685863 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 06 13:49:53 compute-0 podman[204494]: 2025-10-06 13:49:53.125686065 +0000 UTC m=+0.052251618 container create 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6)
Oct 06 13:49:53 compute-0 podman[204494]: 2025-10-06 13:49:53.097121925 +0000 UTC m=+0.023687458 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 06 13:49:53 compute-0 python3[204383]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 06 13:49:53 compute-0 sudo[204381]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:53 compute-0 sudo[204683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlkharhevsxxbmsbhxtcqofkuhbfylwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758593.5108056-1202-220336350651109/AnsiballZ_stat.py'
Oct 06 13:49:53 compute-0 sudo[204683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:54 compute-0 python3.9[204685]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:49:54 compute-0 sudo[204683]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:54 compute-0 sudo[204837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aixolbiyeokkdupwhpwvkxcbsdwrpavw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758594.436677-1220-48405069532248/AnsiballZ_file.py'
Oct 06 13:49:54 compute-0 sudo[204837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:54 compute-0 python3.9[204839]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:55 compute-0 sudo[204837]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:55 compute-0 sudo[204988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqscqfxqrsgduengymwelgzkobayccrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758595.075098-1220-243418137997236/AnsiballZ_copy.py'
Oct 06 13:49:55 compute-0 sudo[204988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:55 compute-0 python3.9[204990]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759758595.075098-1220-243418137997236/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:49:55 compute-0 sudo[204988]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:56 compute-0 sudo[205076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tblvqvjlvcokjuutnmitchnvxmzhchbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758595.075098-1220-243418137997236/AnsiballZ_systemd.py'
Oct 06 13:49:56 compute-0 sudo[205076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:56 compute-0 podman[205038]: 2025-10-06 13:49:56.116509555 +0000 UTC m=+0.108167824 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 06 13:49:56 compute-0 python3.9[205082]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 06 13:49:56 compute-0 systemd[1]: Reloading.
Oct 06 13:49:56 compute-0 systemd-sysv-generator[205119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:49:56 compute-0 systemd-rc-local-generator[205115]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:49:56 compute-0 sudo[205076]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:57 compute-0 sudo[205196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iukdedkhefmwxsydizrbsjqzfgeygmsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758595.075098-1220-243418137997236/AnsiballZ_systemd.py'
Oct 06 13:49:57 compute-0 sudo[205196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:57 compute-0 python3.9[205198]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 06 13:49:57 compute-0 systemd[1]: Reloading.
Oct 06 13:49:57 compute-0 systemd-rc-local-generator[205229]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 06 13:49:57 compute-0 systemd-sysv-generator[205233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 06 13:49:57 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 06 13:49:57 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:49:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 06 13:49:58 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.
Oct 06 13:49:58 compute-0 podman[205238]: 2025-10-06 13:49:58.038310354 +0000 UTC m=+0.167513046 container init 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, name=ubi9-minimal)
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *bridge.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *coverage.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *datapath.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *iface.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *memory.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *ovnnorthd.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *ovn.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *ovsdbserver.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *pmd_perf.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *pmd_rxq.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: INFO    13:49:58 main.go:48: registering *vswitch.Collector
Oct 06 13:49:58 compute-0 openstack_network_exporter[205254]: NOTICE  13:49:58 main.go:76: listening on https://:9105/metrics
Oct 06 13:49:58 compute-0 podman[205238]: 2025-10-06 13:49:58.074133142 +0000 UTC m=+0.203335824 container start 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 13:49:58 compute-0 podman[205238]: openstack_network_exporter
Oct 06 13:49:58 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 06 13:49:58 compute-0 sudo[205196]: pam_unix(sudo:session): session closed for user root
Oct 06 13:49:58 compute-0 podman[205264]: 2025-10-06 13:49:58.215461501 +0000 UTC m=+0.118606020 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct 06 13:49:58 compute-0 sudo[205437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slaacaddghpeshkypnumapekppsgstch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758598.3578312-1268-149071527465470/AnsiballZ_systemd.py'
Oct 06 13:49:58 compute-0 sudo[205437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:49:59 compute-0 python3.9[205439]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 06 13:49:59 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct 06 13:49:59 compute-0 systemd[1]: libpod-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope: Deactivated successfully.
Oct 06 13:49:59 compute-0 conmon[205254]: conmon 105e21466dfc6c026906 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope/container/memory.events
Oct 06 13:49:59 compute-0 podman[205443]: 2025-10-06 13:49:59.198329081 +0000 UTC m=+0.091672905 container died 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=edpm, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct 06 13:49:59 compute-0 systemd[1]: 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79-484f2ab29bc937f1.timer: Deactivated successfully.
Oct 06 13:49:59 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.
Oct 06 13:49:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79-userdata-shm.mount: Deactivated successfully.
Oct 06 13:49:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771-merged.mount: Deactivated successfully.
Oct 06 13:49:59 compute-0 podman[205443]: 2025-10-06 13:49:59.970846366 +0000 UTC m=+0.864190160 container cleanup 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git)
Oct 06 13:49:59 compute-0 podman[205443]: openstack_network_exporter
Oct 06 13:49:59 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 06 13:50:00 compute-0 podman[205472]: openstack_network_exporter
Oct 06 13:50:00 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 06 13:50:00 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct 06 13:50:00 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 06 13:50:00 compute-0 systemd[1]: Started libcrun container.
Oct 06 13:50:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 06 13:50:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 06 13:50:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39f8d660f2fc4ea7c4f1dddead4a088dc3a409f3cb9e6f777a34215e438a6771/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 06 13:50:00 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.
Oct 06 13:50:00 compute-0 podman[205485]: 2025-10-06 13:50:00.262443129 +0000 UTC m=+0.167124815 container init 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *bridge.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *coverage.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *datapath.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *iface.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *memory.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *ovnnorthd.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *ovn.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *ovsdbserver.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *pmd_perf.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *pmd_rxq.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: INFO    13:50:00 main.go:48: registering *vswitch.Collector
Oct 06 13:50:00 compute-0 openstack_network_exporter[205500]: NOTICE  13:50:00 main.go:76: listening on https://:9105/metrics
Oct 06 13:50:00 compute-0 podman[205485]: 2025-10-06 13:50:00.301291959 +0000 UTC m=+0.205973635 container start 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 06 13:50:00 compute-0 podman[205485]: openstack_network_exporter
Oct 06 13:50:00 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 06 13:50:00 compute-0 sudo[205437]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:00 compute-0 podman[205511]: 2025-10-06 13:50:00.409124834 +0000 UTC m=+0.089880595 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Oct 06 13:50:00 compute-0 sudo[205681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztgmpmxntpdcsrotcvychqnwrsebykys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758600.5518591-1284-1467279537199/AnsiballZ_find.py'
Oct 06 13:50:00 compute-0 sudo[205681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:01 compute-0 python3.9[205683]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 06 13:50:01 compute-0 sudo[205681]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:02 compute-0 sudo[205833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foemqxttlisnnyeewqzxanvdljbakeeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758601.486595-1303-226345907882401/AnsiballZ_podman_container_info.py'
Oct 06 13:50:02 compute-0 sudo[205833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:02 compute-0 python3.9[205835]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 06 13:50:02 compute-0 sudo[205833]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:03 compute-0 sudo[205998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noisoxkdnknbvbzaimviijeefsungsht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758602.5376685-1311-280196007532516/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:03 compute-0 sudo[205998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:03 compute-0 python3.9[206000]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:03 compute-0 systemd[1]: Started libpod-conmon-20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b.scope.
Oct 06 13:50:03 compute-0 podman[206001]: 2025-10-06 13:50:03.422592403 +0000 UTC m=+0.118421135 container exec 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:50:03 compute-0 podman[206001]: 2025-10-06 13:50:03.45581968 +0000 UTC m=+0.151648422 container exec_died 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 06 13:50:03 compute-0 systemd[1]: libpod-conmon-20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b.scope: Deactivated successfully.
Oct 06 13:50:03 compute-0 sudo[205998]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:04 compute-0 sudo[206183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgwqkszgyuglnbvevygputbjnurnkiwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758603.7054946-1319-135459721478919/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:04 compute-0 sudo[206183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:04 compute-0 python3.9[206185]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:04 compute-0 systemd[1]: Started libpod-conmon-20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b.scope.
Oct 06 13:50:04 compute-0 podman[206186]: 2025-10-06 13:50:04.364557795 +0000 UTC m=+0.095486578 container exec 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Oct 06 13:50:04 compute-0 podman[206186]: 2025-10-06 13:50:04.399748306 +0000 UTC m=+0.130677089 container exec_died 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:50:04 compute-0 systemd[1]: libpod-conmon-20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b.scope: Deactivated successfully.
Oct 06 13:50:04 compute-0 sudo[206183]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:05 compute-0 sudo[206365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpbpbzuuztldcmzmehyzwoigkzgzdftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758604.6683598-1327-10308276355320/AnsiballZ_file.py'
Oct 06 13:50:05 compute-0 sudo[206365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:05 compute-0 python3.9[206367]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:05 compute-0 sudo[206365]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:05 compute-0 sudo[206517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfupnmxdsznikqvjglpeblrgykfsdjzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758605.6482747-1336-40951618741980/AnsiballZ_podman_container_info.py'
Oct 06 13:50:05 compute-0 sudo[206517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:06 compute-0 python3.9[206519]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 06 13:50:06 compute-0 sudo[206517]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:06 compute-0 sudo[206682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhewrelasimnpwycijlezshsubircqnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758606.504891-1344-177437083103809/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:06 compute-0 sudo[206682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:07 compute-0 python3.9[206684]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:07 compute-0 systemd[1]: Started libpod-conmon-d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9.scope.
Oct 06 13:50:07 compute-0 podman[206685]: 2025-10-06 13:50:07.192618621 +0000 UTC m=+0.102554851 container exec d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:50:07 compute-0 podman[206685]: 2025-10-06 13:50:07.197927546 +0000 UTC m=+0.107863736 container exec_died d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 13:50:07 compute-0 sudo[206682]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:07 compute-0 systemd[1]: libpod-conmon-d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9.scope: Deactivated successfully.
Oct 06 13:50:07 compute-0 sudo[206864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbkcbnzxwkkkpvfcpowytltrvpmccflb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758607.4588468-1352-156316955548110/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:07 compute-0 sudo[206864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:08 compute-0 python3.9[206866]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:08 compute-0 systemd[1]: Started libpod-conmon-d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9.scope.
Oct 06 13:50:08 compute-0 podman[206867]: 2025-10-06 13:50:08.137588386 +0000 UTC m=+0.088178099 container exec d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 13:50:08 compute-0 podman[206867]: 2025-10-06 13:50:08.172616482 +0000 UTC m=+0.123206175 container exec_died d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 13:50:08 compute-0 systemd[1]: libpod-conmon-d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9.scope: Deactivated successfully.
Oct 06 13:50:08 compute-0 sudo[206864]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:08 compute-0 sudo[207048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgjjwphhrycupjblqfhwmkpdicdjrnol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758608.4076278-1360-152026643185307/AnsiballZ_file.py'
Oct 06 13:50:08 compute-0 sudo[207048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:08 compute-0 python3.9[207050]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:09 compute-0 sudo[207048]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:09 compute-0 sudo[207200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhmgtczbshbmizacpaqusvplyxueeot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758609.2524242-1369-41218958903095/AnsiballZ_podman_container_info.py'
Oct 06 13:50:09 compute-0 sudo[207200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:09 compute-0 python3.9[207202]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 06 13:50:09 compute-0 sudo[207200]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:10 compute-0 sudo[207365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfzkjkvuyblqiqyhigoglzwpudkflbrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758610.2175927-1377-195789325995401/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:10 compute-0 sudo[207365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:10 compute-0 python3.9[207367]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:10 compute-0 systemd[1]: Started libpod-conmon-b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480.scope.
Oct 06 13:50:10 compute-0 podman[207368]: 2025-10-06 13:50:10.869776833 +0000 UTC m=+0.094467830 container exec b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid)
Oct 06 13:50:10 compute-0 podman[207368]: 2025-10-06 13:50:10.900086751 +0000 UTC m=+0.124777698 container exec_died b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 13:50:10 compute-0 systemd[1]: libpod-conmon-b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480.scope: Deactivated successfully.
Oct 06 13:50:10 compute-0 sudo[207365]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:50:11.329 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:50:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:50:11.330 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:50:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:50:11.330 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:50:11 compute-0 sudo[207550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byloudtydnvbexjpfdcibphlloumoxgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758611.1518753-1385-128176174750766/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:11 compute-0 sudo[207550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:11 compute-0 python3.9[207552]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:11 compute-0 systemd[1]: Started libpod-conmon-b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480.scope.
Oct 06 13:50:11 compute-0 podman[207553]: 2025-10-06 13:50:11.859183681 +0000 UTC m=+0.104576286 container exec b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4)
Oct 06 13:50:11 compute-0 podman[207553]: 2025-10-06 13:50:11.889762256 +0000 UTC m=+0.135154871 container exec_died b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 13:50:11 compute-0 sudo[207550]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:11 compute-0 systemd[1]: libpod-conmon-b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480.scope: Deactivated successfully.
Oct 06 13:50:12 compute-0 sudo[207734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idqiyclpmhbrmymkefdtakmockqvigcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758612.1408563-1393-44871581028084/AnsiballZ_file.py'
Oct 06 13:50:12 compute-0 sudo[207734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:12 compute-0 python3.9[207736]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:12 compute-0 sudo[207734]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:13 compute-0 sudo[207886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqaqognnovezvyrsegzdposecozvvbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758612.9774144-1402-4331809152460/AnsiballZ_podman_container_info.py'
Oct 06 13:50:13 compute-0 sudo[207886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:13 compute-0 python3.9[207888]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 06 13:50:13 compute-0 sudo[207886]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:14 compute-0 sudo[208051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djjfhvpxtrvkuphrcdqkkogftzulubab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758613.8092873-1410-148136304692986/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:14 compute-0 sudo[208051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:14 compute-0 python3.9[208053]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:14 compute-0 systemd[1]: Started libpod-conmon-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.scope.
Oct 06 13:50:14 compute-0 podman[208054]: 2025-10-06 13:50:14.480155932 +0000 UTC m=+0.104743761 container exec afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 13:50:14 compute-0 podman[208054]: 2025-10-06 13:50:14.511128518 +0000 UTC m=+0.135716267 container exec_died afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 06 13:50:14 compute-0 systemd[1]: libpod-conmon-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.scope: Deactivated successfully.
Oct 06 13:50:14 compute-0 sudo[208051]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:15 compute-0 sudo[208236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdvlxdljxeuubyosyahjcbrzibbrfmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758614.7775025-1418-179193645243755/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:15 compute-0 sudo[208236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:15 compute-0 python3.9[208238]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:15 compute-0 systemd[1]: Started libpod-conmon-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.scope.
Oct 06 13:50:15 compute-0 podman[208239]: 2025-10-06 13:50:15.410408325 +0000 UTC m=+0.109266325 container exec afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 13:50:15 compute-0 podman[208239]: 2025-10-06 13:50:15.441399951 +0000 UTC m=+0.140257971 container exec_died afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:50:15 compute-0 systemd[1]: libpod-conmon-afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9.scope: Deactivated successfully.
Oct 06 13:50:15 compute-0 sudo[208236]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:15 compute-0 podman[208270]: 2025-10-06 13:50:15.625209739 +0000 UTC m=+0.092129436 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:50:16 compute-0 sudo[208442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwveiyhqysamuybwcqfwdvnmtedaisui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758615.7053945-1426-150224991902040/AnsiballZ_file.py'
Oct 06 13:50:16 compute-0 sudo[208442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:16 compute-0 python3.9[208444]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:16 compute-0 sudo[208442]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:16 compute-0 sudo[208594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnztgidoyldkwdkzounmznkyeohopcfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758616.5670784-1435-187729766041116/AnsiballZ_podman_container_info.py'
Oct 06 13:50:16 compute-0 sudo[208594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:17 compute-0 python3.9[208596]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 06 13:50:17 compute-0 sudo[208594]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:17 compute-0 sudo[208759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oegnrvpxhoseydjzvlmmnyjwkzcfqndk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758617.4637597-1443-204300782414545/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:17 compute-0 sudo[208759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:18 compute-0 python3.9[208761]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:18 compute-0 systemd[1]: Started libpod-conmon-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.scope.
Oct 06 13:50:18 compute-0 podman[208762]: 2025-10-06 13:50:18.151873545 +0000 UTC m=+0.090453601 container exec fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:50:18 compute-0 podman[208762]: 2025-10-06 13:50:18.181625028 +0000 UTC m=+0.120205074 container exec_died fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:50:18 compute-0 systemd[1]: libpod-conmon-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.scope: Deactivated successfully.
Oct 06 13:50:18 compute-0 sudo[208759]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:18 compute-0 sudo[208944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcbdurmmvzdgutsbwvcbqbtuotmiyzkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758618.4507594-1451-40818753839952/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:18 compute-0 sudo[208944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:19 compute-0 python3.9[208946]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:19 compute-0 systemd[1]: Started libpod-conmon-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.scope.
Oct 06 13:50:19 compute-0 podman[208947]: 2025-10-06 13:50:19.164731673 +0000 UTC m=+0.107448264 container exec fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:50:19 compute-0 podman[208947]: 2025-10-06 13:50:19.195512793 +0000 UTC m=+0.138229404 container exec_died fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:50:19 compute-0 systemd[1]: libpod-conmon-fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12.scope: Deactivated successfully.
Oct 06 13:50:19 compute-0 sudo[208944]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:19 compute-0 sudo[209154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvbmodshusempdegmkuubirphlvhspvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758619.4491522-1459-27219544926399/AnsiballZ_file.py'
Oct 06 13:50:19 compute-0 sudo[209154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:19 compute-0 podman[209123]: 2025-10-06 13:50:19.772619763 +0000 UTC m=+0.076831220 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 06 13:50:19 compute-0 podman[209113]: 2025-10-06 13:50:19.784778335 +0000 UTC m=+0.097999227 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:50:19 compute-0 podman[209103]: 2025-10-06 13:50:19.789211706 +0000 UTC m=+0.107777114 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 13:50:19 compute-0 python3.9[209174]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:19 compute-0 sudo[209154]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:20 compute-0 sudo[209338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abhwanmwwfygtpravxqhxljizetxlzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758620.1540675-1468-14512273745064/AnsiballZ_podman_container_info.py'
Oct 06 13:50:20 compute-0 sudo[209338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:20 compute-0 python3.9[209340]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 06 13:50:20 compute-0 sudo[209338]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:21 compute-0 sudo[209503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auvuhbnxkszvmfsdkebxbiumdcmdykit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758620.972872-1476-70471389120897/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:21 compute-0 sudo[209503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:21 compute-0 python3.9[209505]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:21 compute-0 systemd[1]: Started libpod-conmon-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope.
Oct 06 13:50:21 compute-0 podman[209506]: 2025-10-06 13:50:21.666655133 +0000 UTC m=+0.116010308 container exec 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Oct 06 13:50:21 compute-0 podman[209506]: 2025-10-06 13:50:21.704391614 +0000 UTC m=+0.153746739 container exec_died 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Oct 06 13:50:21 compute-0 systemd[1]: libpod-conmon-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope: Deactivated successfully.
Oct 06 13:50:21 compute-0 sudo[209503]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:22 compute-0 sudo[209685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blislnjkgnucgvxceezaaaswuinuctbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758621.9691038-1484-268026574620985/AnsiballZ_podman_container_exec.py'
Oct 06 13:50:22 compute-0 sudo[209685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:22 compute-0 python3.9[209687]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 06 13:50:22 compute-0 systemd[1]: Started libpod-conmon-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope.
Oct 06 13:50:22 compute-0 podman[209688]: 2025-10-06 13:50:22.671088582 +0000 UTC m=+0.088615631 container exec 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Oct 06 13:50:22 compute-0 podman[209688]: 2025-10-06 13:50:22.701276546 +0000 UTC m=+0.118803585 container exec_died 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct 06 13:50:22 compute-0 systemd[1]: libpod-conmon-105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79.scope: Deactivated successfully.
Oct 06 13:50:22 compute-0 sudo[209685]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:23 compute-0 sudo[209868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rernfvlwfjyxhdycbpvixxliifexyvxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758622.964411-1492-49748825543175/AnsiballZ_file.py'
Oct 06 13:50:23 compute-0 sudo[209868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:23 compute-0 python3.9[209870]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:23 compute-0 sudo[209868]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:27 compute-0 podman[209895]: 2025-10-06 13:50:27.24350037 +0000 UTC m=+0.097934545 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:50:31 compute-0 podman[209915]: 2025-10-06 13:50:31.224755354 +0000 UTC m=+0.084321490 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6)
Oct 06 13:50:36 compute-0 nova_compute[192903]: 2025-10-06 13:50:36.533 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:36 compute-0 nova_compute[192903]: 2025-10-06 13:50:36.533 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.043 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.044 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.044 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.045 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.045 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.046 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.046 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.047 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.564 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.565 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.565 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.566 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.748 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.748 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.787 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.788 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6087MB free_disk=73.3383903503418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.788 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:50:37 compute-0 nova_compute[192903]: 2025-10-06 13:50:37.789 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:50:38 compute-0 nova_compute[192903]: 2025-10-06 13:50:38.836 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:50:38 compute-0 nova_compute[192903]: 2025-10-06 13:50:38.836 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:50:37 up 51 min,  0 user,  load average: 0.70, 0.82, 0.69\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:50:38 compute-0 nova_compute[192903]: 2025-10-06 13:50:38.912 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:50:39 compute-0 nova_compute[192903]: 2025-10-06 13:50:39.419 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:50:39 compute-0 nova_compute[192903]: 2025-10-06 13:50:39.928 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:50:39 compute-0 nova_compute[192903]: 2025-10-06 13:50:39.928 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:50:42 compute-0 sudo[210062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efpqaylmcfyznylunzgwlndjtpnzvgob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758642.0841892-1700-214943467001335/AnsiballZ_file.py'
Oct 06 13:50:42 compute-0 sudo[210062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:42 compute-0 python3.9[210064]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:42 compute-0 sudo[210062]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:43 compute-0 sudo[210214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tizfwojttgkbdspwpieldajizbahcwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758642.7419572-1716-264119626140237/AnsiballZ_stat.py'
Oct 06 13:50:43 compute-0 sudo[210214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:43 compute-0 python3.9[210216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:43 compute-0 sudo[210214]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:43 compute-0 sudo[210337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycpkvqjvyfohojfzioqvgsujguvvdbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758642.7419572-1716-264119626140237/AnsiballZ_copy.py'
Oct 06 13:50:43 compute-0 sudo[210337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:43 compute-0 python3.9[210339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759758642.7419572-1716-264119626140237/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:43 compute-0 sudo[210337]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:44 compute-0 sudo[210489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnsiwemyqbfahhgaetwalvezlmhltbyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758644.1396968-1748-277808228495229/AnsiballZ_file.py'
Oct 06 13:50:44 compute-0 sudo[210489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:44 compute-0 python3.9[210491]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:44 compute-0 sudo[210489]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:45 compute-0 sudo[210641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjekjuyyfcjjkeuoviioxevdytertsdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758644.7738106-1764-187272404845510/AnsiballZ_stat.py'
Oct 06 13:50:45 compute-0 sudo[210641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:45 compute-0 python3.9[210643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:45 compute-0 sudo[210641]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:45 compute-0 sudo[210719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbtskfieomzrkeicdlkcakpsklzxcvzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758644.7738106-1764-187272404845510/AnsiballZ_file.py'
Oct 06 13:50:45 compute-0 sudo[210719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:45 compute-0 python3.9[210721]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:45 compute-0 sudo[210719]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:46 compute-0 sudo[210881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctjablyvqjqwwpyjivsejvxarqrsxzqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758645.8263004-1788-100826517612092/AnsiballZ_stat.py'
Oct 06 13:50:46 compute-0 sudo[210881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:46 compute-0 podman[210845]: 2025-10-06 13:50:46.11712361 +0000 UTC m=+0.058775479 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:50:46 compute-0 python3.9[210889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:46 compute-0 sudo[210881]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:46 compute-0 sudo[210973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpwdhcvhqsjoqlvbtlghntyzlxctitqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758645.8263004-1788-100826517612092/AnsiballZ_file.py'
Oct 06 13:50:46 compute-0 sudo[210973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:46 compute-0 python3.9[210975]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hm1gn4uh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:46 compute-0 sudo[210973]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:47 compute-0 sudo[211125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alegbhftiuyynjddujfspzbhwkmdkyzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758646.9895403-1812-213991010875735/AnsiballZ_stat.py'
Oct 06 13:50:47 compute-0 sudo[211125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:47 compute-0 python3.9[211127]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:47 compute-0 sudo[211125]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:47 compute-0 sudo[211203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szxknnzfcsrjygwjhdzqjjmauslemkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758646.9895403-1812-213991010875735/AnsiballZ_file.py'
Oct 06 13:50:47 compute-0 sudo[211203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:47 compute-0 python3.9[211205]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:47 compute-0 sudo[211203]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:48 compute-0 sudo[211355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhjaulamckyjkngbkxyrjuzvkvbbzksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758648.1252675-1838-48430770830650/AnsiballZ_command.py'
Oct 06 13:50:48 compute-0 sudo[211355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:48 compute-0 python3.9[211357]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:50:48 compute-0 sudo[211355]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:49 compute-0 sudo[211508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xennuvcpslbjeulymsvmcqbifnjnexrg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759758648.8323042-1854-174392787868969/AnsiballZ_edpm_nftables_from_files.py'
Oct 06 13:50:49 compute-0 sudo[211508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:49 compute-0 python3[211510]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 06 13:50:49 compute-0 sudo[211508]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:50 compute-0 sudo[211683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcgjefbgasxwgxjoupbzsqkjgyowpgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758649.7123148-1870-1457795848567/AnsiballZ_stat.py'
Oct 06 13:50:50 compute-0 sudo[211683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:50 compute-0 podman[211650]: 2025-10-06 13:50:50.05466572 +0000 UTC m=+0.057195237 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd)
Oct 06 13:50:50 compute-0 podman[211652]: 2025-10-06 13:50:50.072753499 +0000 UTC m=+0.071432211 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:50:50 compute-0 podman[211635]: 2025-10-06 13:50:50.129656497 +0000 UTC m=+0.134098265 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930)
Oct 06 13:50:50 compute-0 python3.9[211694]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:50 compute-0 sudo[211683]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:50 compute-0 sudo[211805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-detfunjnlxuyiowjcrbufnjitqlnuzzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758649.7123148-1870-1457795848567/AnsiballZ_file.py'
Oct 06 13:50:50 compute-0 sudo[211805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:50 compute-0 python3.9[211807]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:50 compute-0 sudo[211805]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:51 compute-0 sudo[211957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtrijslopezqxqtxvkuofloyitqlxgyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758650.8597486-1894-78886693058451/AnsiballZ_stat.py'
Oct 06 13:50:51 compute-0 sudo[211957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:51 compute-0 python3.9[211959]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:51 compute-0 sudo[211957]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:51 compute-0 sudo[212035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmdhivbndciomyklxbkmmvhpcawfbmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758650.8597486-1894-78886693058451/AnsiballZ_file.py'
Oct 06 13:50:51 compute-0 sudo[212035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:51 compute-0 python3.9[212037]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:51 compute-0 sudo[212035]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:52 compute-0 sudo[212187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkpcpjcmlqbpgxwjemnfhcnvrrfgtkjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758652.0434566-1918-219724384389549/AnsiballZ_stat.py'
Oct 06 13:50:52 compute-0 sudo[212187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:52 compute-0 python3.9[212189]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:52 compute-0 sudo[212187]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:52 compute-0 sudo[212265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhzlahvxdzzgxwosdjuhrsngtgwlzuvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758652.0434566-1918-219724384389549/AnsiballZ_file.py'
Oct 06 13:50:52 compute-0 sudo[212265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:53 compute-0 python3.9[212267]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:53 compute-0 sudo[212265]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:53 compute-0 sudo[212417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ystnktkjrnycjqfjcxynrbyeytlyhdyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758653.2635043-1942-69255979397357/AnsiballZ_stat.py'
Oct 06 13:50:53 compute-0 sudo[212417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:53 compute-0 python3.9[212419]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:53 compute-0 sudo[212417]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:54 compute-0 sudo[212495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbzezsujzifgdixceqtgenqiiiakzsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758653.2635043-1942-69255979397357/AnsiballZ_file.py'
Oct 06 13:50:54 compute-0 sudo[212495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:54 compute-0 python3.9[212497]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:54 compute-0 sudo[212495]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:55 compute-0 sudo[212647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmfyoejuizdlkouimrkepkmhfitmbhac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758654.561483-1966-58268015954361/AnsiballZ_stat.py'
Oct 06 13:50:55 compute-0 sudo[212647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:55 compute-0 python3.9[212649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 06 13:50:55 compute-0 sudo[212647]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:55 compute-0 sudo[212772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymktahvhkdgpajcjpsjauttsmihntmvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758654.561483-1966-58268015954361/AnsiballZ_copy.py'
Oct 06 13:50:55 compute-0 sudo[212772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:55 compute-0 python3.9[212774]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759758654.561483-1966-58268015954361/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:55 compute-0 sudo[212772]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:56 compute-0 sudo[212924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dglhekraqxusfopphosnueqpajcnszmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758656.1413095-1996-107094711413301/AnsiballZ_file.py'
Oct 06 13:50:56 compute-0 sudo[212924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:56 compute-0 python3.9[212926]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:56 compute-0 sudo[212924]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:57 compute-0 sudo[213076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obucvhmtwmbhmamhooaljhpperikqqjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758656.8699627-2012-265856039314372/AnsiballZ_command.py'
Oct 06 13:50:57 compute-0 sudo[213076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:57 compute-0 python3.9[213078]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:50:57 compute-0 sudo[213076]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:58 compute-0 sudo[213244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewfpgmtroqqqvvjdeabltfemmkjaseju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758657.657484-2028-76919625938027/AnsiballZ_blockinfile.py'
Oct 06 13:50:58 compute-0 sudo[213244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:58 compute-0 podman[213205]: 2025-10-06 13:50:58.209537324 +0000 UTC m=+0.126709886 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 13:50:58 compute-0 python3.9[213253]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:50:58 compute-0 sudo[213244]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:59 compute-0 sudo[213404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzieggbudcoykywioritdpmfsprfsahz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758658.7239847-2046-78044997799505/AnsiballZ_command.py'
Oct 06 13:50:59 compute-0 sudo[213404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:50:59 compute-0 python3.9[213406]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:50:59 compute-0 sudo[213404]: pam_unix(sudo:session): session closed for user root
Oct 06 13:50:59 compute-0 sudo[213557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfcgsjrcsddbcqfnhzdmdzubxrfdjpbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758659.509378-2062-28398351847201/AnsiballZ_stat.py'
Oct 06 13:50:59 compute-0 sudo[213557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:51:00 compute-0 python3.9[213559]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 06 13:51:00 compute-0 sudo[213557]: pam_unix(sudo:session): session closed for user root
Oct 06 13:51:00 compute-0 sudo[213711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpjpdulkbimjvofuzafzsknepjwgdmyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758660.252559-2078-16674850704427/AnsiballZ_command.py'
Oct 06 13:51:00 compute-0 sudo[213711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:51:00 compute-0 python3.9[213713]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 06 13:51:00 compute-0 sudo[213711]: pam_unix(sudo:session): session closed for user root
Oct 06 13:51:01 compute-0 podman[213840]: 2025-10-06 13:51:01.349954418 +0000 UTC m=+0.061370640 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter)
Oct 06 13:51:01 compute-0 sudo[213887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evayvvhpqtdwgjfquxgjeykyoqsbfksi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759758661.053768-2094-218944770897567/AnsiballZ_file.py'
Oct 06 13:51:01 compute-0 sudo[213887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: ERROR   13:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: ERROR   13:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: ERROR   13:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: ERROR   13:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: ERROR   13:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:51:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:51:01 compute-0 python3.9[213889]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 06 13:51:01 compute-0 sudo[213887]: pam_unix(sudo:session): session closed for user root
Oct 06 13:51:02 compute-0 sshd-session[193230]: Connection closed by 192.168.122.30 port 56880
Oct 06 13:51:02 compute-0 sshd-session[193227]: pam_unix(sshd:session): session closed for user zuul
Oct 06 13:51:02 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Oct 06 13:51:02 compute-0 systemd[1]: session-28.scope: Consumed 1min 37.120s CPU time.
Oct 06 13:51:02 compute-0 systemd-logind[789]: Session 28 logged out. Waiting for processes to exit.
Oct 06 13:51:02 compute-0 systemd-logind[789]: Removed session 28.
Oct 06 13:51:02 compute-0 podman[203308]: time="2025-10-06T13:51:02Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:51:02 compute-0 podman[203308]: @ - - [06/Oct/2025:13:51:02 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:51:02 compute-0 podman[203308]: @ - - [06/Oct/2025:13:51:02 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Oct 06 13:51:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:11.331 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:51:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:11.332 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:51:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:11.332 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:51:17 compute-0 podman[213919]: 2025-10-06 13:51:17.190158172 +0000 UTC m=+0.056464787 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:51:20 compute-0 podman[213944]: 2025-10-06 13:51:20.23085576 +0000 UTC m=+0.075074860 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 06 13:51:20 compute-0 podman[213943]: 2025-10-06 13:51:20.246918815 +0000 UTC m=+0.098090463 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 13:51:20 compute-0 podman[213945]: 2025-10-06 13:51:20.275485527 +0000 UTC m=+0.112147163 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 06 13:51:29 compute-0 podman[214008]: 2025-10-06 13:51:29.19582131 +0000 UTC m=+0.060710412 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:51:29 compute-0 podman[203308]: time="2025-10-06T13:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:51:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:51:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2976 "" "Go-http-client/1.1"
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: ERROR   13:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: ERROR   13:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: ERROR   13:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: ERROR   13:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: ERROR   13:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:51:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:51:32 compute-0 podman[214029]: 2025-10-06 13:51:32.194712109 +0000 UTC m=+0.060387723 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm)
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.930 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.930 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.930 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.930 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.931 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.931 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.931 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.931 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:51:39 compute-0 nova_compute[192903]: 2025-10-06 13:51:39.931 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.445 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.446 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.446 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.447 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.650 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.653 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.674 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.675 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6170MB free_disk=73.33640670776367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.676 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:51:40 compute-0 nova_compute[192903]: 2025-10-06 13:51:40.676 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:51:41 compute-0 nova_compute[192903]: 2025-10-06 13:51:41.714 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:51:41 compute-0 nova_compute[192903]: 2025-10-06 13:51:41.714 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:51:40 up 52 min,  0 user,  load average: 0.35, 0.70, 0.66\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:51:41 compute-0 nova_compute[192903]: 2025-10-06 13:51:41.730 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:51:42 compute-0 nova_compute[192903]: 2025-10-06 13:51:42.237 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:51:42 compute-0 nova_compute[192903]: 2025-10-06 13:51:42.748 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:51:42 compute-0 nova_compute[192903]: 2025-10-06 13:51:42.749 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.073s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:51:48 compute-0 podman[214052]: 2025-10-06 13:51:48.202205064 +0000 UTC m=+0.065708037 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:51:51 compute-0 podman[214078]: 2025-10-06 13:51:51.227882017 +0000 UTC m=+0.074636978 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 13:51:51 compute-0 podman[214077]: 2025-10-06 13:51:51.23279244 +0000 UTC m=+0.087146677 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_id=multipathd)
Oct 06 13:51:51 compute-0 podman[214076]: 2025-10-06 13:51:51.281808455 +0000 UTC m=+0.134276571 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 06 13:51:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:59.606 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:51:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:59.607 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 13:51:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:51:59.608 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:52:00 compute-0 podman[214140]: 2025-10-06 13:52:00.214492311 +0000 UTC m=+0.087200658 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4)
Oct 06 13:52:01 compute-0 anacron[3714]: Job `cron.monthly' started
Oct 06 13:52:01 compute-0 anacron[3714]: Job `cron.monthly' terminated
Oct 06 13:52:01 compute-0 anacron[3714]: Normal exit (3 jobs run)
Oct 06 13:52:03 compute-0 podman[214162]: 2025-10-06 13:52:03.235530028 +0000 UTC m=+0.095869063 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 13:52:03 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:03.933 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:a4:f5 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-8084a6fc-60d0-43e5-ae6e-fe1d238d8993', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8084a6fc-60d0-43e5-ae6e-fe1d238d8993', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd142f68afa1489aa76784748e93db34', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=484aaff3-5193-40fa-bce4-222eb82eb85f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=19820d75-49b2-44d2-b7cb-67580d105e30) old=Port_Binding(mac=['fa:16:3e:f8:a4:f5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8084a6fc-60d0-43e5-ae6e-fe1d238d8993', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8084a6fc-60d0-43e5-ae6e-fe1d238d8993', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd142f68afa1489aa76784748e93db34', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:52:03 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:03.934 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 19820d75-49b2-44d2-b7cb-67580d105e30 in datapath 8084a6fc-60d0-43e5-ae6e-fe1d238d8993 updated
Oct 06 13:52:03 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:03.936 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8084a6fc-60d0-43e5-ae6e-fe1d238d8993, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 13:52:03 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:03.937 104072 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp7nhiw7nc/privsep.sock']
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.708 104072 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.709 104072 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp7nhiw7nc/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.557 214189 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.562 214189 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.564 214189 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.564 214189 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214189
Oct 06 13:52:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:04.712 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8c32d827-8866-4cf4-a467-3c6289c3892a]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.155 214189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.155 214189 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.156 214189 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.586 214189 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.590 214189 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 06 13:52:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:05.626 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b6498907-636f-4c2a-bd5f-48ac406ffcbf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:52:09 compute-0 PackageKit[130367]: daemon quit
Oct 06 13:52:09 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 06 13:52:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:11.333 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:52:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:11.334 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:52:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:52:11.334 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:52:19 compute-0 podman[214195]: 2025-10-06 13:52:19.221012749 +0000 UTC m=+0.079340176 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:52:22 compute-0 podman[214221]: 2025-10-06 13:52:22.268692476 +0000 UTC m=+0.125978386 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Oct 06 13:52:22 compute-0 podman[214220]: 2025-10-06 13:52:22.269471117 +0000 UTC m=+0.126319355 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 13:52:22 compute-0 podman[214219]: 2025-10-06 13:52:22.287620908 +0000 UTC m=+0.154554439 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Oct 06 13:52:24 compute-0 sshd-session[214282]: banner exchange: Connection from 195.178.110.15 port 57074: invalid format
Oct 06 13:52:24 compute-0 sshd-session[214283]: banner exchange: Connection from 195.178.110.15 port 57090: invalid format
Oct 06 13:52:29 compute-0 podman[203308]: time="2025-10-06T13:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:52:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:52:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 06 13:52:31 compute-0 podman[214284]: 2025-10-06 13:52:31.203789738 +0000 UTC m=+0.069431468 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: ERROR   13:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: ERROR   13:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: ERROR   13:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: ERROR   13:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: ERROR   13:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:52:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:52:34 compute-0 podman[214306]: 2025-10-06 13:52:34.215572295 +0000 UTC m=+0.075678766 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.397 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.910 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.910 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.910 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.910 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.911 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.911 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.911 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:52:36 compute-0 nova_compute[192903]: 2025-10-06 13:52:36.911 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.436 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.437 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.437 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.438 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.646 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.648 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.670 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.671 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6068MB free_disk=73.34028244018555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.672 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:52:37 compute-0 nova_compute[192903]: 2025-10-06 13:52:37.672 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:52:38 compute-0 nova_compute[192903]: 2025-10-06 13:52:38.738 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:52:38 compute-0 nova_compute[192903]: 2025-10-06 13:52:38.738 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:52:37 up 53 min,  0 user,  load average: 0.24, 0.61, 0.63\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:52:38 compute-0 nova_compute[192903]: 2025-10-06 13:52:38.757 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:52:39 compute-0 nova_compute[192903]: 2025-10-06 13:52:39.264 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:52:39 compute-0 nova_compute[192903]: 2025-10-06 13:52:39.774 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:52:39 compute-0 nova_compute[192903]: 2025-10-06 13:52:39.774 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:52:40 compute-0 nova_compute[192903]: 2025-10-06 13:52:40.954 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:52:50 compute-0 podman[214329]: 2025-10-06 13:52:50.211803784 +0000 UTC m=+0.065519604 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:52:53 compute-0 podman[214357]: 2025-10-06 13:52:53.226856859 +0000 UTC m=+0.070802217 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:52:53 compute-0 podman[214356]: 2025-10-06 13:52:53.233026526 +0000 UTC m=+0.079166274 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 13:52:53 compute-0 podman[214355]: 2025-10-06 13:52:53.253541351 +0000 UTC m=+0.110131402 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 13:52:59 compute-0 podman[203308]: time="2025-10-06T13:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:52:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:52:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: ERROR   13:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: ERROR   13:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: ERROR   13:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: ERROR   13:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: ERROR   13:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:53:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:53:02 compute-0 podman[214419]: 2025-10-06 13:53:02.221511012 +0000 UTC m=+0.080431328 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 06 13:53:05 compute-0 podman[214441]: 2025-10-06 13:53:05.230693169 +0000 UTC m=+0.092405812 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, 
io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, release=1755695350, version=9.6, managed_by=edpm_ansible, distribution-scope=public)
Oct 06 13:53:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:53:11.334 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:53:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:53:11.335 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:53:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:53:11.335 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:53:21 compute-0 podman[214463]: 2025-10-06 13:53:21.226837163 +0000 UTC m=+0.079720818 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:53:24 compute-0 podman[214488]: 2025-10-06 13:53:24.210413128 +0000 UTC m=+0.059132931 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 06 13:53:24 compute-0 podman[214487]: 2025-10-06 13:53:24.235930218 +0000 UTC m=+0.080012026 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:53:24 compute-0 podman[214486]: 2025-10-06 13:53:24.267001169 +0000 UTC m=+0.117701616 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 13:53:29 compute-0 podman[203308]: time="2025-10-06T13:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:53:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:53:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: ERROR   13:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: ERROR   13:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: ERROR   13:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: ERROR   13:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: ERROR   13:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:53:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:53:33 compute-0 podman[214543]: 2025-10-06 13:53:33.203846226 +0000 UTC m=+0.068730801 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 13:53:33 compute-0 nova_compute[192903]: 2025-10-06 13:53:33.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:33 compute-0 nova_compute[192903]: 2025-10-06 13:53:33.583 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 13:53:34 compute-0 nova_compute[192903]: 2025-10-06 13:53:34.093 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 13:53:34 compute-0 nova_compute[192903]: 2025-10-06 13:53:34.094 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:34 compute-0 nova_compute[192903]: 2025-10-06 13:53:34.094 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 13:53:34 compute-0 nova_compute[192903]: 2025-10-06 13:53:34.601 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:36 compute-0 podman[214563]: 2025-10-06 13:53:36.189532718 +0000 UTC m=+0.060923309 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.107 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.108 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.108 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.108 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.108 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.109 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.624 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.625 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.625 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.625 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.798 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.799 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.814 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.815 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.34028244018555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.815 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:53:37 compute-0 nova_compute[192903]: 2025-10-06 13:53:37.815 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:53:38 compute-0 nova_compute[192903]: 2025-10-06 13:53:38.872 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:53:38 compute-0 nova_compute[192903]: 2025-10-06 13:53:38.872 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:53:37 up 54 min,  0 user,  load average: 0.08, 0.50, 0.59\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:53:38 compute-0 nova_compute[192903]: 2025-10-06 13:53:38.897 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:53:39 compute-0 nova_compute[192903]: 2025-10-06 13:53:39.413 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:53:39 compute-0 nova_compute[192903]: 2025-10-06 13:53:39.925 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:53:39 compute-0 nova_compute[192903]: 2025-10-06 13:53:39.925 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:53:40 compute-0 nova_compute[192903]: 2025-10-06 13:53:40.400 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:40 compute-0 nova_compute[192903]: 2025-10-06 13:53:40.400 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:40 compute-0 nova_compute[192903]: 2025-10-06 13:53:40.401 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:53:52 compute-0 podman[214584]: 2025-10-06 13:53:52.209729289 +0000 UTC m=+0.068465743 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:53:55 compute-0 podman[214609]: 2025-10-06 13:53:55.230842595 +0000 UTC m=+0.079275846 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 13:53:55 compute-0 podman[214608]: 2025-10-06 13:53:55.262300376 +0000 UTC m=+0.124838049 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 06 13:53:55 compute-0 podman[214615]: 2025-10-06 13:53:55.27169512 +0000 UTC m=+0.111513708 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 13:53:59 compute-0 podman[203308]: time="2025-10-06T13:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:53:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:53:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: ERROR   13:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: ERROR   13:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: ERROR   13:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: ERROR   13:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: ERROR   13:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:54:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:54:04 compute-0 podman[214673]: 2025-10-06 13:54:04.215578728 +0000 UTC m=+0.077967530 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 13:54:07 compute-0 podman[214695]: 2025-10-06 13:54:07.202422487 +0000 UTC m=+0.067060656 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Oct 06 13:54:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:54:11.337 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:54:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:54:11.337 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:54:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:54:11.338 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:54:23 compute-0 podman[214717]: 2025-10-06 13:54:23.219256428 +0000 UTC m=+0.076629174 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:54:26 compute-0 podman[214743]: 2025-10-06 13:54:26.218520962 +0000 UTC m=+0.074069904 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 13:54:26 compute-0 podman[214742]: 2025-10-06 13:54:26.233998361 +0000 UTC m=+0.088463584 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 06 13:54:26 compute-0 podman[214741]: 2025-10-06 13:54:26.285071283 +0000 UTC m=+0.147764559 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:54:29 compute-0 podman[203308]: time="2025-10-06T13:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:54:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:54:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: ERROR   13:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: ERROR   13:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: ERROR   13:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: ERROR   13:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: ERROR   13:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:54:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:54:35 compute-0 podman[214801]: 2025-10-06 13:54:35.194306883 +0000 UTC m=+0.064101776 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:54:35 compute-0 nova_compute[192903]: 2025-10-06 13:54:35.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:36 compute-0 nova_compute[192903]: 2025-10-06 13:54:36.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:36 compute-0 nova_compute[192903]: 2025-10-06 13:54:36.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:37 compute-0 nova_compute[192903]: 2025-10-06 13:54:37.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:37 compute-0 nova_compute[192903]: 2025-10-06 13:54:37.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:54:38 compute-0 podman[214823]: 2025-10-06 13:54:38.236539549 +0000 UTC m=+0.089083481 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 06 13:54:38 compute-0 nova_compute[192903]: 2025-10-06 13:54:38.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:38 compute-0 nova_compute[192903]: 2025-10-06 13:54:38.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:38 compute-0 nova_compute[192903]: 2025-10-06 13:54:38.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:38 compute-0 nova_compute[192903]: 2025-10-06 13:54:38.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.320 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.321 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.346 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.347 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6096MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.347 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:54:39 compute-0 nova_compute[192903]: 2025-10-06 13:54:39.347 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.422 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.423 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:54:39 up 55 min,  0 user,  load average: 0.03, 0.40, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.474 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.523 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.524 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.535 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.570 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 13:54:40 compute-0 nova_compute[192903]: 2025-10-06 13:54:40.613 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:54:41 compute-0 nova_compute[192903]: 2025-10-06 13:54:41.121 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:54:41 compute-0 nova_compute[192903]: 2025-10-06 13:54:41.633 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:54:41 compute-0 nova_compute[192903]: 2025-10-06 13:54:41.633 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:54:43 compute-0 nova_compute[192903]: 2025-10-06 13:54:43.633 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:54:54 compute-0 podman[214845]: 2025-10-06 13:54:54.226117314 +0000 UTC m=+0.080982392 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:54:57 compute-0 podman[214871]: 2025-10-06 13:54:57.21027683 +0000 UTC m=+0.072927694 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 06 13:54:57 compute-0 podman[214872]: 2025-10-06 13:54:57.229298095 +0000 UTC m=+0.080255042 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 13:54:57 compute-0 podman[214870]: 2025-10-06 13:54:57.269591005 +0000 UTC m=+0.127812359 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 06 13:54:59 compute-0 podman[203308]: time="2025-10-06T13:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:54:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:54:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: ERROR   13:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: ERROR   13:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: ERROR   13:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: ERROR   13:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: ERROR   13:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:55:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:55:06 compute-0 podman[214937]: 2025-10-06 13:55:06.231178832 +0000 UTC m=+0.090074259 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 06 13:55:09 compute-0 podman[214961]: 2025-10-06 13:55:09.240749925 +0000 UTC m=+0.091382413 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm)
Oct 06 13:55:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:11.339 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:55:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:11.339 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:55:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:11.340 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:55:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:12.482 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:55:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:12.483 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 13:55:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:13.485 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:55:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:16.431 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:75:12 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ececa7fa-ba70-46b6-bbc0-dc1ea08e7284', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ececa7fa-ba70-46b6-bbc0-dc1ea08e7284', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00f43357e4a4499298f9a295a396d640', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a355190-4bd2-4b1f-b9ff-837ec4f7fcd9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=513b60b9-1abf-4187-abf6-5b8eeebf4ee6) old=Port_Binding(mac=['fa:16:3e:16:75:12'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ececa7fa-ba70-46b6-bbc0-dc1ea08e7284', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ececa7fa-ba70-46b6-bbc0-dc1ea08e7284', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00f43357e4a4499298f9a295a396d640', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:55:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:16.432 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 513b60b9-1abf-4187-abf6-5b8eeebf4ee6 in datapath ececa7fa-ba70-46b6-bbc0-dc1ea08e7284 updated
Oct 06 13:55:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:16.433 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ececa7fa-ba70-46b6-bbc0-dc1ea08e7284, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 13:55:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:16.438 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bd45a176-4b2f-4e11-a601-6f052b682aa0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:55:25 compute-0 podman[214985]: 2025-10-06 13:55:25.229781395 +0000 UTC m=+0.082503393 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:55:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:27.871 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:37:d8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4a13c6b0-d8b2-47e9-80c8-3791f7e98f51', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a13c6b0-d8b2-47e9-80c8-3791f7e98f51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c5c1e29456f476a8ca95359dfa35689', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e838f77-7131-4620-8a31-1bb0b1ebf792, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4880daac-9b59-4baa-ba6d-e3c36b7d0493) old=Port_Binding(mac=['fa:16:3e:e5:37:d8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4a13c6b0-d8b2-47e9-80c8-3791f7e98f51', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a13c6b0-d8b2-47e9-80c8-3791f7e98f51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c5c1e29456f476a8ca95359dfa35689', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:55:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:27.872 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4880daac-9b59-4baa-ba6d-e3c36b7d0493 in datapath 4a13c6b0-d8b2-47e9-80c8-3791f7e98f51 updated
Oct 06 13:55:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:27.873 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a13c6b0-d8b2-47e9-80c8-3791f7e98f51, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 13:55:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:55:27.875 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[18ac7e7f-eae2-42b8-8c74-2c551eeb8eb4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:55:28 compute-0 podman[215011]: 2025-10-06 13:55:28.214108096 +0000 UTC m=+0.059946233 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 06 13:55:28 compute-0 podman[215010]: 2025-10-06 13:55:28.221164547 +0000 UTC m=+0.070129169 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Oct 06 13:55:28 compute-0 podman[215009]: 2025-10-06 13:55:28.225781522 +0000 UTC m=+0.087524529 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:55:29 compute-0 podman[203308]: time="2025-10-06T13:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:55:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:55:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2985 "" "Go-http-client/1.1"
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: ERROR   13:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: ERROR   13:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: ERROR   13:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: ERROR   13:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: ERROR   13:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:55:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:55:36 compute-0 nova_compute[192903]: 2025-10-06 13:55:36.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:37 compute-0 podman[215072]: 2025-10-06 13:55:37.199693723 +0000 UTC m=+0.069505201 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 06 13:55:37 compute-0 nova_compute[192903]: 2025-10-06 13:55:37.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:37 compute-0 nova_compute[192903]: 2025-10-06 13:55:37.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:37 compute-0 nova_compute[192903]: 2025-10-06 13:55:37.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:55:40 compute-0 podman[215093]: 2025-10-06 13:55:40.240584313 +0000 UTC m=+0.093951212 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal)
Oct 06 13:55:40 compute-0 nova_compute[192903]: 2025-10-06 13:55:40.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:40 compute-0 nova_compute[192903]: 2025-10-06 13:55:40.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:40 compute-0 nova_compute[192903]: 2025-10-06 13:55:40.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:40 compute-0 nova_compute[192903]: 2025-10-06 13:55:40.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.100 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.325 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.326 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.344 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.345 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6103MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:55:41 compute-0 nova_compute[192903]: 2025-10-06 13:55:41.346 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:55:42 compute-0 nova_compute[192903]: 2025-10-06 13:55:42.401 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:55:42 compute-0 nova_compute[192903]: 2025-10-06 13:55:42.402 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:55:41 up 56 min,  0 user,  load average: 0.09, 0.34, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:55:42 compute-0 nova_compute[192903]: 2025-10-06 13:55:42.433 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:55:42 compute-0 nova_compute[192903]: 2025-10-06 13:55:42.943 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:55:43 compute-0 nova_compute[192903]: 2025-10-06 13:55:43.461 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:55:43 compute-0 nova_compute[192903]: 2025-10-06 13:55:43.461 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:55:44 compute-0 nova_compute[192903]: 2025-10-06 13:55:44.461 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:55:56 compute-0 podman[215118]: 2025-10-06 13:55:56.193050871 +0000 UTC m=+0.056106971 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 13:55:59 compute-0 podman[215144]: 2025-10-06 13:55:59.211740249 +0000 UTC m=+0.064915286 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 06 13:55:59 compute-0 podman[215143]: 2025-10-06 13:55:59.224427018 +0000 UTC m=+0.077006329 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 06 13:55:59 compute-0 podman[215142]: 2025-10-06 13:55:59.26267442 +0000 UTC m=+0.120263805 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 06 13:55:59 compute-0 podman[203308]: time="2025-10-06T13:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:55:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:55:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: ERROR   13:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: ERROR   13:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: ERROR   13:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:56:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:56:08 compute-0 podman[215202]: 2025-10-06 13:56:08.236141888 +0000 UTC m=+0.099871740 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 13:56:11 compute-0 podman[215223]: 2025-10-06 13:56:11.217981812 +0000 UTC m=+0.080420511 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 13:56:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:11.342 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:56:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:11.342 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:56:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:11.343 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:56:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:14.157 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:56:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:14.159 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 13:56:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:22.162 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 13:56:27 compute-0 podman[215247]: 2025-10-06 13:56:27.214778871 +0000 UTC m=+0.084104610 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:56:29 compute-0 podman[203308]: time="2025-10-06T13:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:56:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:56:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 06 13:56:30 compute-0 podman[215274]: 2025-10-06 13:56:30.208154662 +0000 UTC m=+0.061724491 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 06 13:56:30 compute-0 podman[215273]: 2025-10-06 13:56:30.231458195 +0000 UTC m=+0.091065125 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:56:30 compute-0 podman[215272]: 2025-10-06 13:56:30.247774091 +0000 UTC m=+0.103509367 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: ERROR   13:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: ERROR   13:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: ERROR   13:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: ERROR   13:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: ERROR   13:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:56:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:56:37 compute-0 nova_compute[192903]: 2025-10-06 13:56:37.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:37 compute-0 nova_compute[192903]: 2025-10-06 13:56:37.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:39 compute-0 podman[215335]: 2025-10-06 13:56:39.192111389 +0000 UTC m=+0.058280079 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:56:39 compute-0 nova_compute[192903]: 2025-10-06 13:56:39.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:39 compute-0 nova_compute[192903]: 2025-10-06 13:56:39.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:56:40 compute-0 nova_compute[192903]: 2025-10-06 13:56:40.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:40 compute-0 nova_compute[192903]: 2025-10-06 13:56:40.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:41 compute-0 nova_compute[192903]: 2025-10-06 13:56:41.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:41 compute-0 nova_compute[192903]: 2025-10-06 13:56:41.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:56:42 compute-0 podman[215356]: 2025-10-06 13:56:42.285131584 +0000 UTC m=+0.131353532 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.309 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.310 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.343 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.343 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.344 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:56:42 compute-0 nova_compute[192903]: 2025-10-06 13:56:42.344 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:56:43 compute-0 nova_compute[192903]: 2025-10-06 13:56:43.403 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:56:43 compute-0 nova_compute[192903]: 2025-10-06 13:56:43.403 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:56:42 up 57 min,  0 user,  load average: 0.03, 0.27, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:56:43 compute-0 nova_compute[192903]: 2025-10-06 13:56:43.430 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:56:43 compute-0 nova_compute[192903]: 2025-10-06 13:56:43.939 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:56:44 compute-0 nova_compute[192903]: 2025-10-06 13:56:44.450 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:56:44 compute-0 nova_compute[192903]: 2025-10-06 13:56:44.451 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:56:45 compute-0 nova_compute[192903]: 2025-10-06 13:56:45.451 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:45 compute-0 nova_compute[192903]: 2025-10-06 13:56:45.452 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:56:58 compute-0 podman[215378]: 2025-10-06 13:56:58.191142365 +0000 UTC m=+0.059241354 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 13:56:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:58.567 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:cc:8f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-842421ce-fb5a-4d58-ae3f-8feb04286e35', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-842421ce-fb5a-4d58-ae3f-8feb04286e35', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5db1e2ad4889484e8864ca433529b41f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51ffe042-4dd0-44c6-8e1d-91c94a3f0275, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=61c52e94-9428-47a0-a71e-58a5bca262c9) old=Port_Binding(mac=['fa:16:3e:dd:cc:8f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-842421ce-fb5a-4d58-ae3f-8feb04286e35', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-842421ce-fb5a-4d58-ae3f-8feb04286e35', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5db1e2ad4889484e8864ca433529b41f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:56:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:58.569 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 61c52e94-9428-47a0-a71e-58a5bca262c9 in datapath 842421ce-fb5a-4d58-ae3f-8feb04286e35 updated
Oct 06 13:56:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:58.570 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 842421ce-fb5a-4d58-ae3f-8feb04286e35, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 13:56:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:56:58.572 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8b69edc3-9714-4de3-95e3-4e316c2c1410]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:56:59 compute-0 podman[203308]: time="2025-10-06T13:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:56:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:56:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 06 13:57:01 compute-0 podman[215404]: 2025-10-06 13:57:01.209654329 +0000 UTC m=+0.069094548 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:57:01 compute-0 podman[215405]: 2025-10-06 13:57:01.212832534 +0000 UTC m=+0.058178956 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 13:57:01 compute-0 podman[215403]: 2025-10-06 13:57:01.255281749 +0000 UTC m=+0.109488718 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: ERROR   13:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: ERROR   13:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: ERROR   13:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:57:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:57:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:04.908 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:12:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-48cf72b3-74fb-405f-a638-fa7f0c90f64b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48cf72b3-74fb-405f-a638-fa7f0c90f64b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b057870c78b4498819d55c93574110b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c39f96fa-c245-420c-9f59-4627d1b19a04, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ebd3b5b9-301d-4dd8-8342-69166129342c) old=Port_Binding(mac=['fa:16:3e:ad:12:1b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-48cf72b3-74fb-405f-a638-fa7f0c90f64b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48cf72b3-74fb-405f-a638-fa7f0c90f64b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b057870c78b4498819d55c93574110b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 13:57:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:04.909 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ebd3b5b9-301d-4dd8-8342-69166129342c in datapath 48cf72b3-74fb-405f-a638-fa7f0c90f64b updated
Oct 06 13:57:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:04.910 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48cf72b3-74fb-405f-a638-fa7f0c90f64b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 13:57:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:04.911 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe7a3b4-5ed2-4f0a-a904-e182b4ca1513]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 13:57:10 compute-0 podman[215465]: 2025-10-06 13:57:10.234895661 +0000 UTC m=+0.082544678 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 13:57:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:11.343 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:57:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:11.344 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:57:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:57:11.344 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:57:13 compute-0 podman[215487]: 2025-10-06 13:57:13.22743842 +0000 UTC m=+0.084759097 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Oct 06 13:57:29 compute-0 podman[215508]: 2025-10-06 13:57:29.196386434 +0000 UTC m=+0.063640642 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:57:29 compute-0 podman[203308]: time="2025-10-06T13:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:57:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:57:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2986 "" "Go-http-client/1.1"
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: ERROR   13:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: ERROR   13:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: ERROR   13:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:57:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:57:32 compute-0 podman[215534]: 2025-10-06 13:57:32.203260749 +0000 UTC m=+0.062814110 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:57:32 compute-0 podman[215535]: 2025-10-06 13:57:32.227983769 +0000 UTC m=+0.075859148 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 06 13:57:32 compute-0 podman[215533]: 2025-10-06 13:57:32.254857417 +0000 UTC m=+0.117438099 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 06 13:57:39 compute-0 nova_compute[192903]: 2025-10-06 13:57:39.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:39 compute-0 nova_compute[192903]: 2025-10-06 13:57:39.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:40 compute-0 nova_compute[192903]: 2025-10-06 13:57:40.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:40 compute-0 nova_compute[192903]: 2025-10-06 13:57:40.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:40 compute-0 nova_compute[192903]: 2025-10-06 13:57:40.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:57:41 compute-0 podman[215599]: 2025-10-06 13:57:41.223797214 +0000 UTC m=+0.080651917 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:57:42 compute-0 nova_compute[192903]: 2025-10-06 13:57:42.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:42 compute-0 nova_compute[192903]: 2025-10-06 13:57:42.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.285 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.287 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.312 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.313 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6093MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.313 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:57:43 compute-0 nova_compute[192903]: 2025-10-06 13:57:43.314 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:57:44 compute-0 podman[215620]: 2025-10-06 13:57:44.229609698 +0000 UTC m=+0.089090972 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Oct 06 13:57:44 compute-0 nova_compute[192903]: 2025-10-06 13:57:44.366 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:57:44 compute-0 nova_compute[192903]: 2025-10-06 13:57:44.367 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:57:43 up 58 min,  0 user,  load average: 0.01, 0.22, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:57:44 compute-0 nova_compute[192903]: 2025-10-06 13:57:44.389 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:57:44 compute-0 nova_compute[192903]: 2025-10-06 13:57:44.897 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:57:45 compute-0 nova_compute[192903]: 2025-10-06 13:57:45.407 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:57:45 compute-0 nova_compute[192903]: 2025-10-06 13:57:45.408 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:57:47 compute-0 nova_compute[192903]: 2025-10-06 13:57:47.411 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:47 compute-0 nova_compute[192903]: 2025-10-06 13:57:47.411 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:57:59 compute-0 podman[203308]: time="2025-10-06T13:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:57:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:57:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 06 13:58:00 compute-0 podman[215641]: 2025-10-06 13:58:00.211610181 +0000 UTC m=+0.073368652 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: ERROR   13:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: ERROR   13:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: ERROR   13:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:58:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:58:03 compute-0 podman[215666]: 2025-10-06 13:58:03.225197562 +0000 UTC m=+0.075951521 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 06 13:58:03 compute-0 podman[215667]: 2025-10-06 13:58:03.230743961 +0000 UTC m=+0.083959246 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 13:58:03 compute-0 podman[215665]: 2025-10-06 13:58:03.281052155 +0000 UTC m=+0.136480119 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 06 13:58:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:58:11.345 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:58:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:58:11.345 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:58:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:58:11.345 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:58:12 compute-0 podman[215731]: 2025-10-06 13:58:12.217214964 +0000 UTC m=+0.075494470 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid)
Oct 06 13:58:15 compute-0 podman[215752]: 2025-10-06 13:58:15.210997701 +0000 UTC m=+0.077840750 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, architecture=x86_64, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Oct 06 13:58:29 compute-0 podman[203308]: time="2025-10-06T13:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:58:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:58:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 06 13:58:31 compute-0 podman[215774]: 2025-10-06 13:58:31.222298688 +0000 UTC m=+0.069520613 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: ERROR   13:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: ERROR   13:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: ERROR   13:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:58:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:58:34 compute-0 podman[215799]: 2025-10-06 13:58:34.234643812 +0000 UTC m=+0.079294259 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 13:58:34 compute-0 podman[215800]: 2025-10-06 13:58:34.247029256 +0000 UTC m=+0.085566643 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Oct 06 13:58:34 compute-0 podman[215798]: 2025-10-06 13:58:34.273751067 +0000 UTC m=+0.125116620 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 06 13:58:39 compute-0 nova_compute[192903]: 2025-10-06 13:58:39.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:39 compute-0 nova_compute[192903]: 2025-10-06 13:58:39.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:39 compute-0 nova_compute[192903]: 2025-10-06 13:58:39.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 13:58:40 compute-0 nova_compute[192903]: 2025-10-06 13:58:40.088 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 13:58:41 compute-0 nova_compute[192903]: 2025-10-06 13:58:41.084 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:41 compute-0 nova_compute[192903]: 2025-10-06 13:58:41.084 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:41 compute-0 nova_compute[192903]: 2025-10-06 13:58:41.085 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:41 compute-0 nova_compute[192903]: 2025-10-06 13:58:41.085 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:58:41 compute-0 nova_compute[192903]: 2025-10-06 13:58:41.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:43 compute-0 podman[215863]: 2025-10-06 13:58:43.222607909 +0000 UTC m=+0.084863575 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250930, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 13:58:43 compute-0 nova_compute[192903]: 2025-10-06 13:58:43.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:43 compute-0 nova_compute[192903]: 2025-10-06 13:58:43.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:43 compute-0 nova_compute[192903]: 2025-10-06 13:58:43.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 13:58:44 compute-0 nova_compute[192903]: 2025-10-06 13:58:44.093 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:45 compute-0 nova_compute[192903]: 2025-10-06 13:58:45.603 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.121 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.122 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.122 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.123 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:58:46 compute-0 podman[215884]: 2025-10-06 13:58:46.216262044 +0000 UTC m=+0.073781335 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.315 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.316 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.342 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.344 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6102MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.344 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:58:46 compute-0 nova_compute[192903]: 2025-10-06 13:58:46.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:58:47 compute-0 nova_compute[192903]: 2025-10-06 13:58:47.398 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:58:47 compute-0 nova_compute[192903]: 2025-10-06 13:58:47.398 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:58:46 up 59 min,  0 user,  load average: 0.14, 0.21, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:58:47 compute-0 nova_compute[192903]: 2025-10-06 13:58:47.426 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:58:47 compute-0 nova_compute[192903]: 2025-10-06 13:58:47.935 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:58:48 compute-0 nova_compute[192903]: 2025-10-06 13:58:48.447 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:58:48 compute-0 nova_compute[192903]: 2025-10-06 13:58:48.447 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:58:49 compute-0 nova_compute[192903]: 2025-10-06 13:58:49.426 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:49 compute-0 nova_compute[192903]: 2025-10-06 13:58:49.426 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:58:59 compute-0 podman[203308]: time="2025-10-06T13:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:58:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:58:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2971 "" "Go-http-client/1.1"
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: ERROR   13:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: ERROR   13:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: ERROR   13:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: ERROR   13:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: ERROR   13:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:59:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:59:02 compute-0 podman[215907]: 2025-10-06 13:59:02.243338725 +0000 UTC m=+0.102964870 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 13:59:05 compute-0 podman[215933]: 2025-10-06 13:59:05.230281865 +0000 UTC m=+0.076851375 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:59:05 compute-0 podman[215932]: 2025-10-06 13:59:05.24765325 +0000 UTC m=+0.099782016 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 13:59:05 compute-0 podman[215931]: 2025-10-06 13:59:05.256373579 +0000 UTC m=+0.114302047 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:59:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:59:11.347 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:59:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:59:11.347 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:59:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 13:59:11.347 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:59:14 compute-0 podman[215991]: 2025-10-06 13:59:14.213431735 +0000 UTC m=+0.073965859 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 13:59:15 compute-0 nova_compute[192903]: 2025-10-06 13:59:15.634 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:17 compute-0 podman[216013]: 2025-10-06 13:59:17.190147375 +0000 UTC m=+0.063144106 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6)
Oct 06 13:59:29 compute-0 podman[203308]: time="2025-10-06T13:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:59:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:59:29 compute-0 podman[203308]: @ - - [06/Oct/2025:13:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2976 "" "Go-http-client/1.1"
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: ERROR   13:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: ERROR   13:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: ERROR   13:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: ERROR   13:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: ERROR   13:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 13:59:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 13:59:33 compute-0 podman[216034]: 2025-10-06 13:59:33.212682298 +0000 UTC m=+0.074180975 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 13:59:36 compute-0 podman[216059]: 2025-10-06 13:59:36.202927593 +0000 UTC m=+0.059611293 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 13:59:36 compute-0 podman[216060]: 2025-10-06 13:59:36.217835824 +0000 UTC m=+0.067199142 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 13:59:36 compute-0 podman[216058]: 2025-10-06 13:59:36.252727608 +0000 UTC m=+0.111574355 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 06 13:59:41 compute-0 nova_compute[192903]: 2025-10-06 13:59:41.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:41 compute-0 nova_compute[192903]: 2025-10-06 13:59:41.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:41 compute-0 nova_compute[192903]: 2025-10-06 13:59:41.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:41 compute-0 nova_compute[192903]: 2025-10-06 13:59:41.585 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 13:59:42 compute-0 nova_compute[192903]: 2025-10-06 13:59:42.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:45 compute-0 podman[216123]: 2025-10-06 13:59:45.208080382 +0000 UTC m=+0.071248539 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 13:59:45 compute-0 nova_compute[192903]: 2025-10-06 13:59:45.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:45 compute-0 nova_compute[192903]: 2025-10-06 13:59:45.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.319 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.320 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.343 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.344 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6091MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 13:59:46 compute-0 nova_compute[192903]: 2025-10-06 13:59:46.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.445 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.446 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:59:46 up  1:00,  0 user,  load average: 0.05, 0.17, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.485 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.521 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.522 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.534 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.554 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 13:59:47 compute-0 nova_compute[192903]: 2025-10-06 13:59:47.573 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 13:59:48 compute-0 nova_compute[192903]: 2025-10-06 13:59:48.080 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 13:59:48 compute-0 podman[216144]: 2025-10-06 13:59:48.219686608 +0000 UTC m=+0.080409378 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 13:59:48 compute-0 nova_compute[192903]: 2025-10-06 13:59:48.590 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 13:59:48 compute-0 nova_compute[192903]: 2025-10-06 13:59:48.591 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.245s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 13:59:49 compute-0 nova_compute[192903]: 2025-10-06 13:59:49.589 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:49 compute-0 nova_compute[192903]: 2025-10-06 13:59:49.590 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 13:59:59 compute-0 podman[203308]: time="2025-10-06T13:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 13:59:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 13:59:59 compute-0 podman[203308]: @ - - [06/Oct/2025:13:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: ERROR   14:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: ERROR   14:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: ERROR   14:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: ERROR   14:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: ERROR   14:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:00:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:00:04 compute-0 podman[216165]: 2025-10-06 14:00:04.239585219 +0000 UTC m=+0.100464894 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:00:06 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:06.284 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:00:06 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:06.285 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:00:07 compute-0 podman[216190]: 2025-10-06 14:00:07.245426033 +0000 UTC m=+0.088899601 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 06 14:00:07 compute-0 podman[216189]: 2025-10-06 14:00:07.245785842 +0000 UTC m=+0.096002377 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:00:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:07.286 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:00:07 compute-0 podman[216188]: 2025-10-06 14:00:07.303931986 +0000 UTC m=+0.160341343 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 06 14:00:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:11.348 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:00:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:11.348 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:00:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:11.348 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:00:16 compute-0 podman[216249]: 2025-10-06 14:00:16.219889105 +0000 UTC m=+0.075433092 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 06 14:00:19 compute-0 podman[216269]: 2025-10-06 14:00:19.222426288 +0000 UTC m=+0.081054682 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Oct 06 14:00:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:22.871 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:b0:9e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bca070f26184ccf81c59294881a8fb1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51f46be0-262e-47b6-a191-ce233c22ccf9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ab1436cf-cf66-4156-9fc9-6221146f6d00) old=Port_Binding(mac=['fa:16:3e:b3:b0:9e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bca070f26184ccf81c59294881a8fb1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:00:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:22.872 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ab1436cf-cf66-4156-9fc9-6221146f6d00 in datapath 79a9c966-c2e3-4299-a86c-5b892b2c16bc updated
Oct 06 14:00:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:22.873 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79a9c966-c2e3-4299-a86c-5b892b2c16bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:00:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:22.874 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a7492139-4fd5-4502-aab1-cad12b14cd21]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:00:29 compute-0 podman[203308]: time="2025-10-06T14:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:00:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:00:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2985 "" "Go-http-client/1.1"
Oct 06 14:00:29 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:29.865 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:e8:1d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2a68ecc4-c8ee-470a-b631-314488e5cf20', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a68ecc4-c8ee-470a-b631-314488e5cf20', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fb66390c2444d9f9fb655ec6a836510', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e2491ec-f235-44b2-adcb-a23ce194daf1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=91c947af-8f36-4d36-989b-38ce108dd9d0) old=Port_Binding(mac=['fa:16:3e:39:e8:1d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2a68ecc4-c8ee-470a-b631-314488e5cf20', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a68ecc4-c8ee-470a-b631-314488e5cf20', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fb66390c2444d9f9fb655ec6a836510', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:00:29 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:29.866 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 91c947af-8f36-4d36-989b-38ce108dd9d0 in datapath 2a68ecc4-c8ee-470a-b631-314488e5cf20 updated
Oct 06 14:00:29 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:29.866 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2a68ecc4-c8ee-470a-b631-314488e5cf20, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:00:29 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:00:29.867 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8305cff3-456d-42c6-b1d5-b9222fa78ba6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: ERROR   14:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: ERROR   14:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: ERROR   14:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: ERROR   14:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: ERROR   14:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:00:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:00:35 compute-0 podman[216291]: 2025-10-06 14:00:35.200613896 +0000 UTC m=+0.070082916 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:00:38 compute-0 podman[216318]: 2025-10-06 14:00:38.20946281 +0000 UTC m=+0.063143918 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:00:38 compute-0 podman[216317]: 2025-10-06 14:00:38.215876504 +0000 UTC m=+0.067309871 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4)
Oct 06 14:00:38 compute-0 podman[216316]: 2025-10-06 14:00:38.247820417 +0000 UTC m=+0.103490130 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 14:00:41 compute-0 nova_compute[192903]: 2025-10-06 14:00:41.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:42 compute-0 nova_compute[192903]: 2025-10-06 14:00:42.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:42 compute-0 nova_compute[192903]: 2025-10-06 14:00:42.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:42 compute-0 nova_compute[192903]: 2025-10-06 14:00:42.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:00:43 compute-0 nova_compute[192903]: 2025-10-06 14:00:43.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:44 compute-0 nova_compute[192903]: 2025-10-06 14:00:44.087 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:45 compute-0 nova_compute[192903]: 2025-10-06 14:00:45.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:45 compute-0 nova_compute[192903]: 2025-10-06 14:00:45.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.119 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.120 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.120 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.120 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.288 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.289 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.303 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.303 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.34049987792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.303 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:00:46 compute-0 nova_compute[192903]: 2025-10-06 14:00:46.304 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:00:47 compute-0 podman[216376]: 2025-10-06 14:00:47.201263381 +0000 UTC m=+0.065347008 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 06 14:00:47 compute-0 nova_compute[192903]: 2025-10-06 14:00:47.345 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:00:47 compute-0 nova_compute[192903]: 2025-10-06 14:00:47.346 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:00:46 up  1:01,  0 user,  load average: 0.02, 0.13, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:00:47 compute-0 nova_compute[192903]: 2025-10-06 14:00:47.372 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:00:47 compute-0 nova_compute[192903]: 2025-10-06 14:00:47.975 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:00:48 compute-0 nova_compute[192903]: 2025-10-06 14:00:48.485 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:00:48 compute-0 nova_compute[192903]: 2025-10-06 14:00:48.486 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:00:49 compute-0 nova_compute[192903]: 2025-10-06 14:00:49.487 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:49 compute-0 nova_compute[192903]: 2025-10-06 14:00:49.487 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:00:50 compute-0 podman[216396]: 2025-10-06 14:00:50.229487348 +0000 UTC m=+0.087721723 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Oct 06 14:00:59 compute-0 podman[203308]: time="2025-10-06T14:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:00:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:00:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: ERROR   14:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: ERROR   14:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: ERROR   14:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: ERROR   14:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: ERROR   14:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:01:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:01:01 compute-0 CROND[216420]: (root) CMD (run-parts /etc/cron.hourly)
Oct 06 14:01:01 compute-0 run-parts[216423]: (/etc/cron.hourly) starting 0anacron
Oct 06 14:01:01 compute-0 run-parts[216429]: (/etc/cron.hourly) finished 0anacron
Oct 06 14:01:01 compute-0 CROND[216419]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 06 14:01:04 compute-0 nova_compute[192903]: 2025-10-06 14:01:04.453 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:04 compute-0 nova_compute[192903]: 2025-10-06 14:01:04.453 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:04 compute-0 nova_compute[192903]: 2025-10-06 14:01:04.967 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:01:05 compute-0 nova_compute[192903]: 2025-10-06 14:01:05.585 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:05 compute-0 nova_compute[192903]: 2025-10-06 14:01:05.586 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:05 compute-0 nova_compute[192903]: 2025-10-06 14:01:05.595 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:01:05 compute-0 nova_compute[192903]: 2025-10-06 14:01:05.596 2 INFO nova.compute.claims [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:01:06 compute-0 podman[216430]: 2025-10-06 14:01:06.189109376 +0000 UTC m=+0.058750130 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:01:06 compute-0 nova_compute[192903]: 2025-10-06 14:01:06.673 2 DEBUG nova.compute.provider_tree [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:01:07 compute-0 nova_compute[192903]: 2025-10-06 14:01:07.180 2 DEBUG nova.scheduler.client.report [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:01:07 compute-0 nova_compute[192903]: 2025-10-06 14:01:07.692 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:07 compute-0 nova_compute[192903]: 2025-10-06 14:01:07.693 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:01:08 compute-0 nova_compute[192903]: 2025-10-06 14:01:08.209 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:01:08 compute-0 nova_compute[192903]: 2025-10-06 14:01:08.209 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:01:08 compute-0 nova_compute[192903]: 2025-10-06 14:01:08.210 2 WARNING neutronclient.v2_0.client [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:08 compute-0 nova_compute[192903]: 2025-10-06 14:01:08.214 2 WARNING neutronclient.v2_0.client [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:08 compute-0 nova_compute[192903]: 2025-10-06 14:01:08.728 2 INFO nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:01:09 compute-0 podman[216455]: 2025-10-06 14:01:09.191903396 +0000 UTC m=+0.056090407 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:01:09 compute-0 podman[216456]: 2025-10-06 14:01:09.215177635 +0000 UTC m=+0.078815602 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:01:09 compute-0 podman[216454]: 2025-10-06 14:01:09.221071255 +0000 UTC m=+0.084415294 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Oct 06 14:01:09 compute-0 nova_compute[192903]: 2025-10-06 14:01:09.243 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.269 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.271 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.272 2 INFO nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Creating image(s)
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.273 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.274 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.274 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.275 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:10 compute-0 nova_compute[192903]: 2025-10-06 14:01:10.276 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:11.164 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:01:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:11.166 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:01:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:11.350 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:11.350 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:11.351 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.754 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.757 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.757 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.837 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.part --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.837 2 DEBUG nova.virt.images [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] 22f1b7c7-d15f-4caf-8898-de5e10b0ea89 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.845 2 DEBUG nova.privsep.utils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.845 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.part /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:11 compute-0 nova_compute[192903]: 2025-10-06 14:01:11.970 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Successfully created port: 987cff90-5279-4251-a5f3-2c78272fcd0f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.766 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.part /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.converted" returned: 0 in 0.921s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.772 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.832 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.834 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.558s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.834 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.838 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:12 compute-0 nova_compute[192903]: 2025-10-06 14:01:12.841 2 INFO oslo.privsep.daemon [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp81ap7pe8/privsep.sock']
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.552 2 INFO oslo.privsep.daemon [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Spawned new privsep daemon via rootwrap
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.404 68 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.410 68 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.413 68 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.414 68 INFO oslo.privsep.daemon [-] privsep daemon running as pid 68
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.641 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.702 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Successfully updated port: 987cff90-5279-4251-a5f3-2c78272fcd0f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.714 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.715 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.716 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.717 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.724 2 DEBUG oslo_utils.imageutils.format_inspector [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.725 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.771 2 DEBUG nova.compute.manager [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-changed-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.771 2 DEBUG nova.compute.manager [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Refreshing instance network info cache due to event network-changed-987cff90-5279-4251-a5f3-2c78272fcd0f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.772 2 DEBUG oslo_concurrency.lockutils [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.772 2 DEBUG oslo_concurrency.lockutils [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.772 2 DEBUG nova.network.neutron [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Refreshing network info cache for port 987cff90-5279-4251-a5f3-2c78272fcd0f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.782 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.783 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.856 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk 1073741824" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.857 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.857 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.918 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.919 2 DEBUG nova.virt.disk.api [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Checking if we can resize image /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:01:13 compute-0 nova_compute[192903]: 2025-10-06 14:01:13.919 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.003 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.004 2 DEBUG nova.virt.disk.api [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Cannot resize image /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.005 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.005 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Ensure instance console log exists: /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.006 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.006 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.006 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:14.168 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.237 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.293 2 WARNING neutronclient.v2_0.client [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.692 2 DEBUG nova.network.neutron [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:01:14 compute-0 nova_compute[192903]: 2025-10-06 14:01:14.810 2 DEBUG nova.network.neutron [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:01:15 compute-0 nova_compute[192903]: 2025-10-06 14:01:15.336 2 DEBUG oslo_concurrency.lockutils [req-3404f0d5-e45e-4cae-a8a3-cea5bc50dfe3 req-b315835f-8d1f-4fce-ae36-0903f86f2279 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:01:15 compute-0 nova_compute[192903]: 2025-10-06 14:01:15.337 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquired lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:01:15 compute-0 nova_compute[192903]: 2025-10-06 14:01:15.338 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:01:16 compute-0 nova_compute[192903]: 2025-10-06 14:01:16.268 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:01:16 compute-0 nova_compute[192903]: 2025-10-06 14:01:16.496 2 WARNING neutronclient.v2_0.client [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:16 compute-0 nova_compute[192903]: 2025-10-06 14:01:16.637 2 DEBUG nova.network.neutron [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Updating instance_info_cache with network_info: [{"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.154 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Releasing lock "refresh_cache-6a6c5614-7397-46ad-923d-8a9d018ab5e4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.155 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance network_info: |[{"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.159 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Start _get_guest_xml network_info=[{"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.165 2 WARNING nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.168 2 DEBUG nova.virt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1020838277', uuid='6a6c5614-7397-46ad-923d-8a9d018ab5e4'), owner=OwnerMeta(userid='841ae4d9261e45a6919d04253e085a88', username='tempest-TestDataModel-1431028624-project-admin', projectid='2fb66390c2444d9f9fb655ec6a836510', projectname='tempest-TestDataModel-1431028624'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759277.1683645) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.219 2 DEBUG nova.virt.libvirt.host [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.220 2 DEBUG nova.virt.libvirt.host [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.225 2 DEBUG nova.virt.libvirt.host [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.226 2 DEBUG nova.virt.libvirt.host [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.227 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.227 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.228 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.229 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.229 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.229 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.230 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.230 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.231 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.231 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.231 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.232 2 DEBUG nova.virt.hardware [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.237 2 DEBUG nova.privsep.utils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.239 2 DEBUG nova.virt.libvirt.vif [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:01:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1020838277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1020838277',id=3,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2fb66390c2444d9f9fb655ec6a836510',ramdisk_id='',reservation_id='r-hga38hxh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1431028624',owner_user_name='tempest-TestDataModel-1431028624-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:01:09Z,user_data=None,user_id='841ae4d9261e45a6919d04253e085a88',uuid=6a6c5614-7397-46ad-923d-8a9d018ab5e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.240 2 DEBUG nova.network.os_vif_util [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converting VIF {"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.241 2 DEBUG nova.network.os_vif_util [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.243 2 DEBUG nova.objects.instance [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a6c5614-7397-46ad-923d-8a9d018ab5e4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.751 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <uuid>6a6c5614-7397-46ad-923d-8a9d018ab5e4</uuid>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <name>instance-00000003</name>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:name>tempest-TestDataModel-server-1020838277</nova:name>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:01:17</nova:creationTime>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:01:17 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:01:17 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:user uuid="841ae4d9261e45a6919d04253e085a88">tempest-TestDataModel-1431028624-project-admin</nova:user>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:project uuid="2fb66390c2444d9f9fb655ec6a836510">tempest-TestDataModel-1431028624</nova:project>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         <nova:port uuid="987cff90-5279-4251-a5f3-2c78272fcd0f">
Oct 06 14:01:17 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <system>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="serial">6a6c5614-7397-46ad-923d-8a9d018ab5e4</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="uuid">6a6c5614-7397-46ad-923d-8a9d018ab5e4</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </system>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <os>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </os>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <features>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </features>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.config"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:a1:79:d5"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <target dev="tap987cff90-52"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/console.log" append="off"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <video>
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </video>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:01:17 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:01:17 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:01:17 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:01:17 compute-0 nova_compute[192903]: </domain>
Oct 06 14:01:17 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.753 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Preparing to wait for external event network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.753 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.754 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.754 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.755 2 DEBUG nova.virt.libvirt.vif [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:01:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1020838277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1020838277',id=3,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2fb66390c2444d9f9fb655ec6a836510',ramdisk_id='',reservation_id='r-hga38hxh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1431028624',owner_user_name='tempest-TestDataModel-1431028624-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:01:09Z,user_data=None,user_id='841ae4d9261e45a6919d04253e085a88',uuid=6a6c5614-7397-46ad-923d-8a9d018ab5e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.756 2 DEBUG nova.network.os_vif_util [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converting VIF {"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.757 2 DEBUG nova.network.os_vif_util [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.758 2 DEBUG os_vif [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.813 2 DEBUG ovsdbapp.backend.ovs_idl [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.813 2 DEBUG ovsdbapp.backend.ovs_idl [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.813 2 DEBUG ovsdbapp.backend.ovs_idl [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0148a58a-df0b-52fe-ba98-3efe6e734e77', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:17 compute-0 nova_compute[192903]: 2025-10-06 14:01:17.833 2 INFO oslo.privsep.daemon [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpy2l4y2lv/privsep.sock']
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:18 compute-0 podman[216555]: 2025-10-06 14:01:18.21891839 +0000 UTC m=+0.076308825 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.635 2 INFO oslo.privsep.daemon [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Spawned new privsep daemon via rootwrap
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.484 89 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.491 89 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.496 89 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.496 89 INFO oslo.privsep.daemon [-] privsep daemon running as pid 89
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap987cff90-52, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap987cff90-52, col_values=(('qos', UUID('b7ac8c45-83d6-412d-b265-a307ddb33c4a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap987cff90-52, col_values=(('external_ids', {'iface-id': '987cff90-5279-4251-a5f3-2c78272fcd0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:79:d5', 'vm-uuid': '6a6c5614-7397-46ad-923d-8a9d018ab5e4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:01:18 compute-0 NetworkManager[52035]: <info>  [1759759278.9062] manager: (tap987cff90-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:18 compute-0 nova_compute[192903]: 2025-10-06 14:01:18.915 2 INFO os_vif [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52')
Oct 06 14:01:20 compute-0 nova_compute[192903]: 2025-10-06 14:01:20.454 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:01:20 compute-0 nova_compute[192903]: 2025-10-06 14:01:20.455 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:01:20 compute-0 nova_compute[192903]: 2025-10-06 14:01:20.455 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] No VIF found with MAC fa:16:3e:a1:79:d5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:01:20 compute-0 nova_compute[192903]: 2025-10-06 14:01:20.456 2 INFO nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Using config drive
Oct 06 14:01:20 compute-0 nova_compute[192903]: 2025-10-06 14:01:20.967 2 WARNING neutronclient.v2_0.client [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:21 compute-0 podman[216582]: 2025-10-06 14:01:21.211090402 +0000 UTC m=+0.072919162 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 14:01:21 compute-0 nova_compute[192903]: 2025-10-06 14:01:21.784 2 INFO nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Creating config drive at /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.config
Oct 06 14:01:21 compute-0 nova_compute[192903]: 2025-10-06 14:01:21.789 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpcde2htp3 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:21 compute-0 nova_compute[192903]: 2025-10-06 14:01:21.914 2 DEBUG oslo_concurrency.processutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpcde2htp3" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:21 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 06 14:01:22 compute-0 kernel: tap987cff90-52: entered promiscuous mode
Oct 06 14:01:22 compute-0 NetworkManager[52035]: <info>  [1759759282.0042] manager: (tap987cff90-52): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct 06 14:01:22 compute-0 ovn_controller[95205]: 2025-10-06T14:01:22Z|00040|binding|INFO|Claiming lport 987cff90-5279-4251-a5f3-2c78272fcd0f for this chassis.
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:22 compute-0 ovn_controller[95205]: 2025-10-06T14:01:22Z|00041|binding|INFO|987cff90-5279-4251-a5f3-2c78272fcd0f: Claiming fa:16:3e:a1:79:d5 10.100.0.5
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.021 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:79:d5 10.100.0.5'], port_security=['fa:16:3e:a1:79:d5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6a6c5614-7397-46ad-923d-8a9d018ab5e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fb66390c2444d9f9fb655ec6a836510', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e33db856-d640-4027-b460-a6038bb3bcaa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51f46be0-262e-47b6-a191-ce233c22ccf9, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=987cff90-5279-4251-a5f3-2c78272fcd0f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.022 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 987cff90-5279-4251-a5f3-2c78272fcd0f in datapath 79a9c966-c2e3-4299-a86c-5b892b2c16bc bound to our chassis
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.024 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79a9c966-c2e3-4299-a86c-5b892b2c16bc
Oct 06 14:01:22 compute-0 systemd-udevd[216627]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.050 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[588b89bd-8557-4faf-a4cc-cca2c9947712]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.051 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79a9c966-c1 in ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.053 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79a9c966-c0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.054 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[83a67fc8-74fb-4336-9a17-4479b0bd9009]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.055 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5c534b-a236-4e2b-a36c-46e42b3a3007]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 NetworkManager[52035]: <info>  [1759759282.0719] device (tap987cff90-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:01:22 compute-0 NetworkManager[52035]: <info>  [1759759282.0730] device (tap987cff90-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.072 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[51868056-e69e-47f6-9948-3251cb9daca7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:22 compute-0 systemd-machined[152985]: New machine qemu-1-instance-00000003.
Oct 06 14:01:22 compute-0 ovn_controller[95205]: 2025-10-06T14:01:22Z|00042|binding|INFO|Setting lport 987cff90-5279-4251-a5f3-2c78272fcd0f ovn-installed in OVS
Oct 06 14:01:22 compute-0 ovn_controller[95205]: 2025-10-06T14:01:22Z|00043|binding|INFO|Setting lport 987cff90-5279-4251-a5f3-2c78272fcd0f up in Southbound
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.091 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bf86bb-24c1-49b1-ab5d-5c5b21c4139a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.092 104072 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmprgkkhj8h/privsep.sock']
Oct 06 14:01:22 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.788 2 DEBUG nova.compute.manager [req-0717d08a-253b-4f19-92d3-7a01040cdb04 req-da8b2361-59c9-4b9b-bdf0-4573b45f7742 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.788 2 DEBUG oslo_concurrency.lockutils [req-0717d08a-253b-4f19-92d3-7a01040cdb04 req-da8b2361-59c9-4b9b-bdf0-4573b45f7742 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.789 2 DEBUG oslo_concurrency.lockutils [req-0717d08a-253b-4f19-92d3-7a01040cdb04 req-da8b2361-59c9-4b9b-bdf0-4573b45f7742 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.789 2 DEBUG oslo_concurrency.lockutils [req-0717d08a-253b-4f19-92d3-7a01040cdb04 req-da8b2361-59c9-4b9b-bdf0-4573b45f7742 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.790 2 DEBUG nova.compute.manager [req-0717d08a-253b-4f19-92d3-7a01040cdb04 req-da8b2361-59c9-4b9b-bdf0-4573b45f7742 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Processing event network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.844 104072 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.845 104072 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprgkkhj8h/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.722 216656 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.727 216656 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.730 216656 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.730 216656 INFO oslo.privsep.daemon [-] privsep daemon running as pid 216656
Oct 06 14:01:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:22.847 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[dc96ecae-ecbd-41aa-8b35-9961f98c8dc2]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.956 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.968 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.972 2 INFO nova.virt.libvirt.driver [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance spawned successfully.
Oct 06 14:01:22 compute-0 nova_compute[192903]: 2025-10-06 14:01:22.973 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.363 216656 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.363 216656 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.364 216656 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.491 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.493 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.494 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.495 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.496 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.497 2 DEBUG nova.virt.libvirt.driver [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.790 216656 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.795 216656 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.859 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[56473227-8fcd-424d-8490-17372a65b281]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 NetworkManager[52035]: <info>  [1759759283.8657] manager: (tap79a9c966-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.866 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7b5751-1081-48ec-9061-84a37e492830]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 systemd-udevd[216632]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.897 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[4da919f0-877c-4591-abfc-e8205b6280be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.900 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ee7863-f75d-45dc-b662-158994aa3c04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 nova_compute[192903]: 2025-10-06 14:01:23.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:23 compute-0 NetworkManager[52035]: <info>  [1759759283.9274] device (tap79a9c966-c0): carrier: link connected
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.935 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7142ab9c-0ed4-4f31-9ee6-bdcaaf3b2e6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.965 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a5be7cba-ff1b-4b00-87d2-e2474796a325]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79a9c966-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b0:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374557, 'reachable_time': 31507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216678, 'error': None, 'target': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:23.983 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a1169a2f-1632-41ae-86d1-9cbf278a104a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:b09e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374557, 'tstamp': 374557}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216679, 'error': None, 'target': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.000 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea8567a-e7c8-4f55-aa08-353db229c332]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79a9c966-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:b0:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374557, 'reachable_time': 31507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216680, 'error': None, 'target': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.009 2 INFO nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Took 13.74 seconds to spawn the instance on the hypervisor.
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.011 2 DEBUG nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.039 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4f012d35-e623-4749-a723-9b6ebd7c3c1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.114 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d10985db-f9d6-44ef-8039-5eea91fae6d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.115 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79a9c966-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.116 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.116 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79a9c966-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:24 compute-0 kernel: tap79a9c966-c0: entered promiscuous mode
Oct 06 14:01:24 compute-0 NetworkManager[52035]: <info>  [1759759284.1190] manager: (tap79a9c966-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.121 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79a9c966-c0, col_values=(('external_ids', {'iface-id': 'ab1436cf-cf66-4156-9fc9-6221146f6d00'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:24 compute-0 ovn_controller[95205]: 2025-10-06T14:01:24Z|00044|binding|INFO|Releasing lport ab1436cf-cf66-4156-9fc9-6221146f6d00 from this chassis (sb_readonly=0)
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.137 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f742bef2-3139-404c-a9f3-9401b3d9adcb]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.138 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.138 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.138 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 79a9c966-c2e3-4299-a86c-5b892b2c16bc disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.139 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.140 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b39204-a48b-4155-afa3-eba5cb6747ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.141 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.141 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[737c7b46-3e0a-4c4c-b0fe-c45a76790dde]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.142 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-79a9c966-c2e3-4299-a86c-5b892b2c16bc
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 79a9c966-c2e3-4299-a86c-5b892b2c16bc
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:01:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:24.145 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'env', 'PROCESS_TAG=haproxy-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79a9c966-c2e3-4299-a86c-5b892b2c16bc.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.547 2 INFO nova.compute.manager [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Took 19.06 seconds to build instance.
Oct 06 14:01:24 compute-0 podman[216713]: 2025-10-06 14:01:24.568813999 +0000 UTC m=+0.048648657 container create 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:01:24 compute-0 systemd[1]: Started libpod-conmon-82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266.scope.
Oct 06 14:01:24 compute-0 podman[216713]: 2025-10-06 14:01:24.546752862 +0000 UTC m=+0.026587540 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:01:24 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:01:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c291a81f650c86e1361256e71f94a01497272dfb7360e7fab3988b9251ccb298/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:01:24 compute-0 podman[216713]: 2025-10-06 14:01:24.670640472 +0000 UTC m=+0.150475210 container init 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 06 14:01:24 compute-0 podman[216713]: 2025-10-06 14:01:24.677263591 +0000 UTC m=+0.157098289 container start 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 14:01:24 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [NOTICE]   (216733) : New worker (216735) forked
Oct 06 14:01:24 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [NOTICE]   (216733) : Loading success.
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.852 2 DEBUG nova.compute.manager [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.854 2 DEBUG oslo_concurrency.lockutils [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.855 2 DEBUG oslo_concurrency.lockutils [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.855 2 DEBUG oslo_concurrency.lockutils [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.856 2 DEBUG nova.compute.manager [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] No waiting events found dispatching network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:01:24 compute-0 nova_compute[192903]: 2025-10-06 14:01:24.856 2 WARNING nova.compute.manager [req-ea171774-554b-4b12-ad23-00b4fcd59a14 req-56e9b387-e5b0-44e2-ba18-145f8e2782f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received unexpected event network-vif-plugged-987cff90-5279-4251-a5f3-2c78272fcd0f for instance with vm_state active and task_state None.
Oct 06 14:01:25 compute-0 nova_compute[192903]: 2025-10-06 14:01:25.053 2 DEBUG oslo_concurrency.lockutils [None req-beb9dbae-332d-469f-94e4-2990b39997a4 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.600s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.482 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.482 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.483 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.483 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.483 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:27 compute-0 nova_compute[192903]: 2025-10-06 14:01:27.494 2 INFO nova.compute.manager [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Terminating instance
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.016 2 DEBUG nova.compute.manager [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:01:28 compute-0 kernel: tap987cff90-52 (unregistering): left promiscuous mode
Oct 06 14:01:28 compute-0 NetworkManager[52035]: <info>  [1759759288.0501] device (tap987cff90-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:01:28 compute-0 ovn_controller[95205]: 2025-10-06T14:01:28Z|00045|binding|INFO|Releasing lport 987cff90-5279-4251-a5f3-2c78272fcd0f from this chassis (sb_readonly=0)
Oct 06 14:01:28 compute-0 ovn_controller[95205]: 2025-10-06T14:01:28Z|00046|binding|INFO|Setting lport 987cff90-5279-4251-a5f3-2c78272fcd0f down in Southbound
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 ovn_controller[95205]: 2025-10-06T14:01:28Z|00047|binding|INFO|Removing iface tap987cff90-52 ovn-installed in OVS
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.069 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:79:d5 10.100.0.5'], port_security=['fa:16:3e:a1:79:d5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6a6c5614-7397-46ad-923d-8a9d018ab5e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fb66390c2444d9f9fb655ec6a836510', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e33db856-d640-4027-b460-a6038bb3bcaa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51f46be0-262e-47b6-a191-ce233c22ccf9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=987cff90-5279-4251-a5f3-2c78272fcd0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.071 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 987cff90-5279-4251-a5f3-2c78272fcd0f in datapath 79a9c966-c2e3-4299-a86c-5b892b2c16bc unbound from our chassis
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.072 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79a9c966-c2e3-4299-a86c-5b892b2c16bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.074 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3bec32-7118-4671-ad13-68ab4a28434a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.075 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc namespace which is not needed anymore
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 06 14:01:28 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 5.963s CPU time.
Oct 06 14:01:28 compute-0 systemd-machined[152985]: Machine qemu-1-instance-00000003 terminated.
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [NOTICE]   (216733) : haproxy version is 3.0.5-8e879a5
Oct 06 14:01:28 compute-0 podman[216769]: 2025-10-06 14:01:28.243728961 +0000 UTC m=+0.038689427 container kill 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:01:28 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [NOTICE]   (216733) : path to executable is /usr/sbin/haproxy
Oct 06 14:01:28 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [WARNING]  (216733) : Exiting Master process...
Oct 06 14:01:28 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [ALERT]    (216733) : Current worker (216735) exited with code 143 (Terminated)
Oct 06 14:01:28 compute-0 neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc[216729]: [WARNING]  (216733) : All workers exited. Exiting... (0)
Oct 06 14:01:28 compute-0 systemd[1]: libpod-82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266.scope: Deactivated successfully.
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.286 2 INFO nova.virt.libvirt.driver [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Instance destroyed successfully.
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.287 2 DEBUG nova.objects.instance [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lazy-loading 'resources' on Instance uuid 6a6c5614-7397-46ad-923d-8a9d018ab5e4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:01:28 compute-0 podman[216790]: 2025-10-06 14:01:28.309464009 +0000 UTC m=+0.040238249 container died 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:01:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266-userdata-shm.mount: Deactivated successfully.
Oct 06 14:01:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-c291a81f650c86e1361256e71f94a01497272dfb7360e7fab3988b9251ccb298-merged.mount: Deactivated successfully.
Oct 06 14:01:28 compute-0 podman[216790]: 2025-10-06 14:01:28.348367731 +0000 UTC m=+0.079141961 container cleanup 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:01:28 compute-0 systemd[1]: libpod-conmon-82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266.scope: Deactivated successfully.
Oct 06 14:01:28 compute-0 podman[216798]: 2025-10-06 14:01:28.369120372 +0000 UTC m=+0.073659923 container remove 82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.392 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4190bf39-a5a2-41db-9677-eb37ceb9a19a]: (4, ("Mon Oct  6 02:01:28 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc (82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266)\n82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266\nMon Oct  6 02:01:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc (82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266)\n82bc8cf384d859ec59acc977766fbdca2c646888df3868ab532f7f142d4da266\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.394 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1371d760-4491-4764-a15c-7c5b4ec18156]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.394 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79a9c966-c2e3-4299-a86c-5b892b2c16bc.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.395 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0e93c6-d913-4a11-813b-29d904582007]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.396 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79a9c966-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 kernel: tap79a9c966-c0: left promiscuous mode
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.425 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[90a2db4a-bad6-4804-9d21-1b409f9f47ef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.465 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[19680706-9742-4d77-b684-80d80a562147]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.466 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[59516f88-0257-403a-adc8-d17c6262e8a6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.491 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bbee0f3c-e1ad-4b41-80d3-5e590fcdfe6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374549, 'reachable_time': 17209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216834, 'error': None, 'target': 'ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.496 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79a9c966-c2e3-4299-a86c-5b892b2c16bc deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:01:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:28.497 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[be47c6a9-20c4-4c97-8b12-10c7f58c49c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:01:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d79a9c966\x2dc2e3\x2d4299\x2da86c\x2d5b892b2c16bc.mount: Deactivated successfully.
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.775 2 DEBUG nova.compute.manager [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.776 2 DEBUG oslo_concurrency.lockutils [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.776 2 DEBUG oslo_concurrency.lockutils [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.777 2 DEBUG oslo_concurrency.lockutils [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.777 2 DEBUG nova.compute.manager [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] No waiting events found dispatching network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.777 2 DEBUG nova.compute.manager [req-4212486f-fc8d-432d-a73a-4f05781f9682 req-054657a2-ae99-4af6-918f-5ea4edc4bfb7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.794 2 DEBUG nova.virt.libvirt.vif [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:01:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1020838277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1020838277',id=3,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:01:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2fb66390c2444d9f9fb655ec6a836510',ramdisk_id='',reservation_id='r-hga38hxh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestDataModel-1431028624',owner_user_name='tempest-TestDataModel-1431028624-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:01:24Z,user_data=None,user_id='841ae4d9261e45a6919d04253e085a88',uuid=6a6c5614-7397-46ad-923d-8a9d018ab5e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.794 2 DEBUG nova.network.os_vif_util [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converting VIF {"id": "987cff90-5279-4251-a5f3-2c78272fcd0f", "address": "fa:16:3e:a1:79:d5", "network": {"id": "79a9c966-c2e3-4299-a86c-5b892b2c16bc", "bridge": "br-int", "label": "tempest-TestDataModel-162226600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5bca070f26184ccf81c59294881a8fb1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap987cff90-52", "ovs_interfaceid": "987cff90-5279-4251-a5f3-2c78272fcd0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.795 2 DEBUG nova.network.os_vif_util [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.795 2 DEBUG os_vif [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.799 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap987cff90-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.803 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b7ac8c45-83d6-412d-b265-a307ddb33c4a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.808 2 INFO os_vif [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:79:d5,bridge_name='br-int',has_traffic_filtering=True,id=987cff90-5279-4251-a5f3-2c78272fcd0f,network=Network(79a9c966-c2e3-4299-a86c-5b892b2c16bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap987cff90-52')
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.808 2 INFO nova.virt.libvirt.driver [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Deleting instance files /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4_del
Oct 06 14:01:28 compute-0 nova_compute[192903]: 2025-10-06 14:01:28.809 2 INFO nova.virt.libvirt.driver [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Deletion of /var/lib/nova/instances/6a6c5614-7397-46ad-923d-8a9d018ab5e4_del complete
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.322 2 INFO nova.compute.manager [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.324 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.324 2 DEBUG nova.compute.manager [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.324 2 DEBUG nova.network.neutron [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.325 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:29 compute-0 nova_compute[192903]: 2025-10-06 14:01:29.690 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:01:29 compute-0 podman[203308]: time="2025-10-06T14:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:01:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:01:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2995 "" "Go-http-client/1.1"
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.442 2 DEBUG nova.network.neutron [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.844 2 DEBUG nova.compute.manager [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.845 2 DEBUG oslo_concurrency.lockutils [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.845 2 DEBUG oslo_concurrency.lockutils [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.846 2 DEBUG oslo_concurrency.lockutils [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.846 2 DEBUG nova.compute.manager [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] No waiting events found dispatching network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.846 2 DEBUG nova.compute.manager [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-unplugged-987cff90-5279-4251-a5f3-2c78272fcd0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.846 2 DEBUG nova.compute.manager [req-2d29d58d-7996-40d2-a0aa-37c2f3deb03b req-31c6c3d7-9fb3-4ceb-8a3f-3f7ec75616d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Received event network-vif-deleted-987cff90-5279-4251-a5f3-2c78272fcd0f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:01:30 compute-0 nova_compute[192903]: 2025-10-06 14:01:30.948 2 INFO nova.compute.manager [-] [instance: 6a6c5614-7397-46ad-923d-8a9d018ab5e4] Took 1.62 seconds to deallocate network for instance.
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: ERROR   14:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: ERROR   14:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: ERROR   14:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: ERROR   14:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: ERROR   14:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:01:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:01:31 compute-0 nova_compute[192903]: 2025-10-06 14:01:31.470 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:31 compute-0 nova_compute[192903]: 2025-10-06 14:01:31.470 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:31 compute-0 nova_compute[192903]: 2025-10-06 14:01:31.524 2 DEBUG nova.compute.provider_tree [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.051 2 ERROR nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] [req-ae242745-dc18-4aae-8e1b-fda7f02183ec] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 603c9dc2-ee32-4e36-82be-dcfb995e2be1.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-ae242745-dc18-4aae-8e1b-fda7f02183ec"}]}
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.070 2 DEBUG nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.084 2 DEBUG nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.085 2 DEBUG nova.compute.provider_tree [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.103 2 DEBUG nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.120 2 DEBUG nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.156 2 DEBUG nova.compute.provider_tree [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.702 2 DEBUG nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updated inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.703 2 DEBUG nova.compute.provider_tree [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 14:01:32 compute-0 nova_compute[192903]: 2025-10-06 14:01:32.703 2 DEBUG nova.compute.provider_tree [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:01:33 compute-0 nova_compute[192903]: 2025-10-06 14:01:33.214 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.743s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:33 compute-0 nova_compute[192903]: 2025-10-06 14:01:33.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:33 compute-0 nova_compute[192903]: 2025-10-06 14:01:33.252 2 INFO nova.scheduler.client.report [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Deleted allocations for instance 6a6c5614-7397-46ad-923d-8a9d018ab5e4
Oct 06 14:01:33 compute-0 nova_compute[192903]: 2025-10-06 14:01:33.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:34 compute-0 nova_compute[192903]: 2025-10-06 14:01:34.282 2 DEBUG oslo_concurrency.lockutils [None req-6698ef8e-d208-481b-811e-b4806d8464fc 841ae4d9261e45a6919d04253e085a88 2fb66390c2444d9f9fb655ec6a836510 - - default default] Lock "6a6c5614-7397-46ad-923d-8a9d018ab5e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.799s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:37 compute-0 podman[216837]: 2025-10-06 14:01:37.21510593 +0000 UTC m=+0.070458106 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:01:38 compute-0 nova_compute[192903]: 2025-10-06 14:01:38.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:38 compute-0 nova_compute[192903]: 2025-10-06 14:01:38.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:40 compute-0 podman[216867]: 2025-10-06 14:01:40.206675606 +0000 UTC m=+0.054414082 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 06 14:01:40 compute-0 podman[216865]: 2025-10-06 14:01:40.22568924 +0000 UTC m=+0.083640322 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:01:40 compute-0 podman[216864]: 2025-10-06 14:01:40.235349771 +0000 UTC m=+0.097216939 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller)
Oct 06 14:01:41 compute-0 nova_compute[192903]: 2025-10-06 14:01:41.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:43 compute-0 nova_compute[192903]: 2025-10-06 14:01:43.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:43 compute-0 nova_compute[192903]: 2025-10-06 14:01:43.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:43 compute-0 nova_compute[192903]: 2025-10-06 14:01:43.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:01:43 compute-0 nova_compute[192903]: 2025-10-06 14:01:43.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:44 compute-0 nova_compute[192903]: 2025-10-06 14:01:44.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:44 compute-0 nova_compute[192903]: 2025-10-06 14:01:44.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:45 compute-0 nova_compute[192903]: 2025-10-06 14:01:45.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:45 compute-0 nova_compute[192903]: 2025-10-06 14:01:45.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.099 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.318 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.319 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.352 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.353 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.30627822875977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.354 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:01:46 compute-0 nova_compute[192903]: 2025-10-06 14:01:46.354 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:01:47 compute-0 nova_compute[192903]: 2025-10-06 14:01:47.404 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:01:47 compute-0 nova_compute[192903]: 2025-10-06 14:01:47.405 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:01:46 up  1:02,  0 user,  load average: 0.23, 0.17, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:01:47 compute-0 nova_compute[192903]: 2025-10-06 14:01:47.423 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:01:47 compute-0 nova_compute[192903]: 2025-10-06 14:01:47.930 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:01:48 compute-0 nova_compute[192903]: 2025-10-06 14:01:48.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:48 compute-0 nova_compute[192903]: 2025-10-06 14:01:48.443 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:01:48 compute-0 nova_compute[192903]: 2025-10-06 14:01:48.444 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:01:48 compute-0 nova_compute[192903]: 2025-10-06 14:01:48.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:49 compute-0 podman[216931]: 2025-10-06 14:01:49.230815962 +0000 UTC m=+0.086762076 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct 06 14:01:49 compute-0 nova_compute[192903]: 2025-10-06 14:01:49.444 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:49 compute-0 nova_compute[192903]: 2025-10-06 14:01:49.445 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:49 compute-0 nova_compute[192903]: 2025-10-06 14:01:49.445 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:01:52 compute-0 podman[216952]: 2025-10-06 14:01:52.20431212 +0000 UTC m=+0.067032943 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Oct 06 14:01:53 compute-0 nova_compute[192903]: 2025-10-06 14:01:53.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:53 compute-0 nova_compute[192903]: 2025-10-06 14:01:53.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:58 compute-0 nova_compute[192903]: 2025-10-06 14:01:58.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:58 compute-0 nova_compute[192903]: 2025-10-06 14:01:58.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:01:59 compute-0 podman[203308]: time="2025-10-06T14:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:01:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:01:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2993 "" "Go-http-client/1.1"
Oct 06 14:01:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:59.936 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f3:37 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f79c2b9daff04f20aa823813dfdde9e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4cb572c5-2fe1-4cc2-9aac-d044653b4542) old=Port_Binding(mac=['fa:16:3e:b8:f3:37'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f79c2b9daff04f20aa823813dfdde9e4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:01:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:59.937 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4cb572c5-2fe1-4cc2-9aac-d044653b4542 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 updated
Oct 06 14:01:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:59.938 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 69d92bff-38df-455c-b731-a2864652e2a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:01:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:01:59.939 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[862888f8-b15e-4d6c-9693-2f2e01abb3ab]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: ERROR   14:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: ERROR   14:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: ERROR   14:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: ERROR   14:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: ERROR   14:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:02:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:02:03 compute-0 nova_compute[192903]: 2025-10-06 14:02:03.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:03 compute-0 nova_compute[192903]: 2025-10-06 14:02:03.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:04.129 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:02:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:04.129 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:02:04 compute-0 nova_compute[192903]: 2025-10-06 14:02:04.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:05.131 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:07.532 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:e3:2f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-22b4e68c-8978-4364-a120-006a1892e04e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22b4e68c-8978-4364-a120-006a1892e04e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a9b117b-7a12-4870-bffa-29b55ed30670, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f55b23ea-7468-478e-97ea-9351e6d65a2e) old=Port_Binding(mac=['fa:16:3e:82:e3:2f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-22b4e68c-8978-4364-a120-006a1892e04e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22b4e68c-8978-4364-a120-006a1892e04e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:02:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:07.533 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f55b23ea-7468-478e-97ea-9351e6d65a2e in datapath 22b4e68c-8978-4364-a120-006a1892e04e updated
Oct 06 14:02:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:07.534 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22b4e68c-8978-4364-a120-006a1892e04e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:02:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:07.535 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[71ec3ee3-f7a9-4eb3-b4dd-508d322535a5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:02:08 compute-0 podman[216975]: 2025-10-06 14:02:08.20892149 +0000 UTC m=+0.067371944 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:02:08 compute-0 nova_compute[192903]: 2025-10-06 14:02:08.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:08 compute-0 nova_compute[192903]: 2025-10-06 14:02:08.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:11 compute-0 podman[217000]: 2025-10-06 14:02:11.219075921 +0000 UTC m=+0.073769686 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 06 14:02:11 compute-0 podman[217001]: 2025-10-06 14:02:11.232101013 +0000 UTC m=+0.089890472 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:02:11 compute-0 podman[216999]: 2025-10-06 14:02:11.270513591 +0000 UTC m=+0.125997747 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 14:02:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:11.351 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:11.352 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:02:11.352 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:13 compute-0 nova_compute[192903]: 2025-10-06 14:02:13.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:13 compute-0 nova_compute[192903]: 2025-10-06 14:02:13.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:18 compute-0 nova_compute[192903]: 2025-10-06 14:02:18.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:18 compute-0 nova_compute[192903]: 2025-10-06 14:02:18.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:20 compute-0 podman[217062]: 2025-10-06 14:02:20.223711469 +0000 UTC m=+0.085291278 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:02:23 compute-0 podman[217083]: 2025-10-06 14:02:23.22431066 +0000 UTC m=+0.087370685 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Oct 06 14:02:23 compute-0 nova_compute[192903]: 2025-10-06 14:02:23.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:23 compute-0 ovn_controller[95205]: 2025-10-06T14:02:23Z|00048|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 06 14:02:23 compute-0 nova_compute[192903]: 2025-10-06 14:02:23.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:28 compute-0 nova_compute[192903]: 2025-10-06 14:02:28.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:28 compute-0 nova_compute[192903]: 2025-10-06 14:02:28.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:29 compute-0 podman[203308]: time="2025-10-06T14:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:02:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:02:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2997 "" "Go-http-client/1.1"
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: ERROR   14:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: ERROR   14:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: ERROR   14:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: ERROR   14:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: ERROR   14:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:02:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:02:33 compute-0 nova_compute[192903]: 2025-10-06 14:02:33.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:33 compute-0 nova_compute[192903]: 2025-10-06 14:02:33.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:38 compute-0 nova_compute[192903]: 2025-10-06 14:02:38.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:38 compute-0 nova_compute[192903]: 2025-10-06 14:02:38.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:39 compute-0 podman[217105]: 2025-10-06 14:02:39.226948351 +0000 UTC m=+0.084640839 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:02:42 compute-0 podman[217130]: 2025-10-06 14:02:42.20495229 +0000 UTC m=+0.068717049 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:02:42 compute-0 podman[217131]: 2025-10-06 14:02:42.224080947 +0000 UTC m=+0.078210986 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:02:42 compute-0 podman[217129]: 2025-10-06 14:02:42.248482307 +0000 UTC m=+0.112087582 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 06 14:02:43 compute-0 nova_compute[192903]: 2025-10-06 14:02:43.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:43 compute-0 nova_compute[192903]: 2025-10-06 14:02:43.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:43 compute-0 nova_compute[192903]: 2025-10-06 14:02:43.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:44 compute-0 nova_compute[192903]: 2025-10-06 14:02:44.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:45 compute-0 nova_compute[192903]: 2025-10-06 14:02:45.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:45 compute-0 nova_compute[192903]: 2025-10-06 14:02:45.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:45 compute-0 nova_compute[192903]: 2025-10-06 14:02:45.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:02:47 compute-0 nova_compute[192903]: 2025-10-06 14:02:47.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.086 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.436 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.436 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.597 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.597 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.598 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.598 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.818 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.819 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.844 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.845 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5894MB free_disk=73.30627822875977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.845 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.845 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:48 compute-0 nova_compute[192903]: 2025-10-06 14:02:48.943 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:02:49 compute-0 nova_compute[192903]: 2025-10-06 14:02:49.501 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:50 compute-0 nova_compute[192903]: 2025-10-06 14:02:50.382 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Oct 06 14:02:50 compute-0 nova_compute[192903]: 2025-10-06 14:02:50.382 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:02:50 compute-0 nova_compute[192903]: 2025-10-06 14:02:50.383 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:02:48 up  1:03,  0 user,  load average: 0.30, 0.20, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:02:50 compute-0 nova_compute[192903]: 2025-10-06 14:02:50.414 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:02:50 compute-0 nova_compute[192903]: 2025-10-06 14:02:50.921 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:02:51 compute-0 podman[217190]: 2025-10-06 14:02:51.209087133 +0000 UTC m=+0.070257491 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4)
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.432 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.432 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.587s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.432 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.931s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.437 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.438 2 INFO nova.compute.claims [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.928 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.928 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:51 compute-0 nova_compute[192903]: 2025-10-06 14:02:51.929 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:02:52 compute-0 nova_compute[192903]: 2025-10-06 14:02:52.482 2 DEBUG nova.compute.provider_tree [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:02:52 compute-0 nova_compute[192903]: 2025-10-06 14:02:52.991 2 DEBUG nova.scheduler.client.report [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:02:53 compute-0 nova_compute[192903]: 2025-10-06 14:02:53.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:53 compute-0 nova_compute[192903]: 2025-10-06 14:02:53.501 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:53 compute-0 nova_compute[192903]: 2025-10-06 14:02:53.502 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:02:53 compute-0 nova_compute[192903]: 2025-10-06 14:02:53.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.014 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.015 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.015 2 WARNING neutronclient.v2_0.client [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.015 2 WARNING neutronclient.v2_0.client [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:02:54 compute-0 podman[217211]: 2025-10-06 14:02:54.204069413 +0000 UTC m=+0.066822788 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6)
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.524 2 INFO nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:02:54 compute-0 nova_compute[192903]: 2025-10-06 14:02:54.607 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Successfully created port: 367788c3-83c2-4360-a817-da04de69a6a2 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.035 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.782 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Successfully updated port: 367788c3-83c2-4360-a817-da04de69a6a2 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.841 2 DEBUG nova.compute.manager [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-changed-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.841 2 DEBUG nova.compute.manager [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Refreshing instance network info cache due to event network-changed-367788c3-83c2-4360-a817-da04de69a6a2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.842 2 DEBUG oslo_concurrency.lockutils [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.842 2 DEBUG oslo_concurrency.lockutils [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:02:55 compute-0 nova_compute[192903]: 2025-10-06 14:02:55.843 2 DEBUG nova.network.neutron [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Refreshing network info cache for port 367788c3-83c2-4360-a817-da04de69a6a2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.055 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.057 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.057 2 INFO nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Creating image(s)
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.058 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.059 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.060 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.061 2 DEBUG oslo_utils.imageutils.format_inspector [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.067 2 DEBUG oslo_utils.imageutils.format_inspector [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.070 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.157 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.159 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.160 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.160 2 DEBUG oslo_utils.imageutils.format_inspector [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.167 2 DEBUG oslo_utils.imageutils.format_inspector [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.168 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.232 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.233 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.263 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.264 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.264 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.288 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.310 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.311 2 DEBUG nova.virt.disk.api [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Checking if we can resize image /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.311 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.349 2 WARNING neutronclient.v2_0.client [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.358 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.359 2 DEBUG nova.virt.disk.api [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Cannot resize image /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.359 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.359 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Ensure instance console log exists: /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.360 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.360 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.360 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.479 2 DEBUG nova.network.neutron [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:02:56 compute-0 nova_compute[192903]: 2025-10-06 14:02:56.663 2 DEBUG nova.network.neutron [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:02:57 compute-0 nova_compute[192903]: 2025-10-06 14:02:57.172 2 DEBUG oslo_concurrency.lockutils [req-949d35e8-91fc-41fd-aa67-56867762dbb6 req-8c4d37ee-47a3-4a16-b55a-f64cf9adc11b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:02:57 compute-0 nova_compute[192903]: 2025-10-06 14:02:57.174 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquired lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:02:57 compute-0 nova_compute[192903]: 2025-10-06 14:02:57.175 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:02:57 compute-0 nova_compute[192903]: 2025-10-06 14:02:57.749 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.078 2 WARNING neutronclient.v2_0.client [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.237 2 DEBUG nova.network.neutron [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Updating instance_info_cache with network_info: [{"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.744 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Releasing lock "refresh_cache-46246aa4-aa4f-4a8e-93ba-5fc685a531a0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.744 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance network_info: |[{"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.748 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Start _get_guest_xml network_info=[{"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.752 2 WARNING nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.753 2 DEBUG nova.virt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-669924934', uuid='46246aa4-aa4f-4a8e-93ba-5fc685a531a0'), owner=OwnerMeta(userid='4beaed30a2ec47bb9b5f6adb81ede0f7', username='tempest-TestExecuteActionsViaActuator-1260248176-project-admin', projectid='20952eb66a9c4fd2905273fb8f800689', projectname='tempest-TestExecuteActionsViaActuator-1260248176'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": 
"367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759378.753828) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.758 2 DEBUG nova.virt.libvirt.host [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.759 2 DEBUG nova.virt.libvirt.host [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.762 2 DEBUG nova.virt.libvirt.host [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.763 2 DEBUG nova.virt.libvirt.host [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.763 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.763 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.764 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.764 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.764 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.764 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.764 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.765 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.765 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.765 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.765 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.765 2 DEBUG nova.virt.hardware [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.768 2 DEBUG nova.virt.libvirt.vif [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:02:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-669924934',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-669924934',id=5,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-db5hsdgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaAct
uator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:02:55Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=46246aa4-aa4f-4a8e-93ba-5fc685a531a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.768 2 DEBUG nova.network.os_vif_util [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.769 2 DEBUG nova.network.os_vif_util [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.769 2 DEBUG nova.objects.instance [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:02:58 compute-0 nova_compute[192903]: 2025-10-06 14:02:58.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.278 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <uuid>46246aa4-aa4f-4a8e-93ba-5fc685a531a0</uuid>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <name>instance-00000005</name>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-669924934</nova:name>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:02:58</nova:creationTime>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:02:59 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:02:59 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:user uuid="4beaed30a2ec47bb9b5f6adb81ede0f7">tempest-TestExecuteActionsViaActuator-1260248176-project-admin</nova:user>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:project uuid="20952eb66a9c4fd2905273fb8f800689">tempest-TestExecuteActionsViaActuator-1260248176</nova:project>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         <nova:port uuid="367788c3-83c2-4360-a817-da04de69a6a2">
Oct 06 14:02:59 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <system>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="serial">46246aa4-aa4f-4a8e-93ba-5fc685a531a0</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="uuid">46246aa4-aa4f-4a8e-93ba-5fc685a531a0</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </system>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <os>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </os>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <features>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </features>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.config"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:7b:91:80"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <target dev="tap367788c3-83"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/console.log" append="off"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <video>
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </video>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:02:59 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:02:59 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:02:59 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:02:59 compute-0 nova_compute[192903]: </domain>
Oct 06 14:02:59 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.280 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Preparing to wait for external event network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.280 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.280 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.281 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.281 2 DEBUG nova.virt.libvirt.vif [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:02:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-669924934',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-669924934',id=5,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-db5hsdgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteAct
ionsViaActuator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:02:55Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=46246aa4-aa4f-4a8e-93ba-5fc685a531a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.282 2 DEBUG nova.network.os_vif_util [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.282 2 DEBUG nova.network.os_vif_util [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.283 2 DEBUG os_vif [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.283 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2c7bd6ec-e2ca-5782-aa15-d1fe8c423e82', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.291 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap367788c3-83, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.291 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap367788c3-83, col_values=(('qos', UUID('09ed6a01-bf06-481e-9c73-3250a0051f5c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.292 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap367788c3-83, col_values=(('external_ids', {'iface-id': '367788c3-83c2-4360-a817-da04de69a6a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:91:80', 'vm-uuid': '46246aa4-aa4f-4a8e-93ba-5fc685a531a0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:02:59 compute-0 NetworkManager[52035]: <info>  [1759759379.2943] manager: (tap367788c3-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:02:59 compute-0 nova_compute[192903]: 2025-10-06 14:02:59.300 2 INFO os_vif [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83')
Oct 06 14:02:59 compute-0 podman[203308]: time="2025-10-06T14:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:02:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:02:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2998 "" "Go-http-client/1.1"
Oct 06 14:03:00 compute-0 nova_compute[192903]: 2025-10-06 14:03:00.848 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:03:00 compute-0 nova_compute[192903]: 2025-10-06 14:03:00.849 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:03:00 compute-0 nova_compute[192903]: 2025-10-06 14:03:00.849 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No VIF found with MAC fa:16:3e:7b:91:80, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:03:00 compute-0 nova_compute[192903]: 2025-10-06 14:03:00.850 2 INFO nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Using config drive
Oct 06 14:03:01 compute-0 nova_compute[192903]: 2025-10-06 14:03:01.365 2 WARNING neutronclient.v2_0.client [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: ERROR   14:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: ERROR   14:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: ERROR   14:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: ERROR   14:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: ERROR   14:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:03:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:03:01 compute-0 nova_compute[192903]: 2025-10-06 14:03:01.841 2 INFO nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Creating config drive at /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.config
Oct 06 14:03:01 compute-0 nova_compute[192903]: 2025-10-06 14:03:01.852 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp36z4ul1w execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:03:01 compute-0 nova_compute[192903]: 2025-10-06 14:03:01.988 2 DEBUG oslo_concurrency.processutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp36z4ul1w" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:03:02 compute-0 kernel: tap367788c3-83: entered promiscuous mode
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.0751] manager: (tap367788c3-83): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Oct 06 14:03:02 compute-0 ovn_controller[95205]: 2025-10-06T14:03:02Z|00049|binding|INFO|Claiming lport 367788c3-83c2-4360-a817-da04de69a6a2 for this chassis.
Oct 06 14:03:02 compute-0 ovn_controller[95205]: 2025-10-06T14:03:02Z|00050|binding|INFO|367788c3-83c2-4360-a817-da04de69a6a2: Claiming fa:16:3e:7b:91:80 10.100.0.8
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.102 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:91:80 10.100.0.8'], port_security=['fa:16:3e:7b:91:80 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '46246aa4-aa4f-4a8e-93ba-5fc685a531a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=367788c3-83c2-4360-a817-da04de69a6a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.103 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 367788c3-83c2-4360-a817-da04de69a6a2 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 bound to our chassis
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.104 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.124 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8c04dd-e3f1-424a-a09e-8c2ca4a3fdc7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.124 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap69d92bff-31 in ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:03:02 compute-0 systemd-udevd[217266]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.133 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap69d92bff-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.133 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc9e98f-4523-4b2e-ba36-948dbc036872]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.134 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[94ba470c-8c60-42c0-84f8-670a95908bd6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 systemd-machined[152985]: New machine qemu-2-instance-00000005.
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 ovn_controller[95205]: 2025-10-06T14:03:02Z|00051|binding|INFO|Setting lport 367788c3-83c2-4360-a817-da04de69a6a2 ovn-installed in OVS
Oct 06 14:03:02 compute-0 ovn_controller[95205]: 2025-10-06T14:03:02Z|00052|binding|INFO|Setting lport 367788c3-83c2-4360-a817-da04de69a6a2 up in Southbound
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.1506] device (tap367788c3-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:03:02 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.1533] device (tap367788c3-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.152 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd552db-124e-49a9-bc71-25862d467dfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.172 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1a2b70-5a19-48b7-b444-9e887a688415]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.212 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[57137f7f-b264-455f-b955-4a4d14633906]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.218 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a570fdca-9dc6-4dd8-997c-72da4d8c9f8a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 systemd-udevd[217269]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.2192] manager: (tap69d92bff-30): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.266 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0b620a-a7b9-49ea-8ab9-1240124dc0ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.270 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba42211-874b-47b7-98f1-cdf40658d81c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.3038] device (tap69d92bff-30): carrier: link connected
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.313 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c9fc068e-6da5-476f-a910-88d85a3c20d1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.334 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7521d865-fd60-4c0e-942d-4825ade3f70d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 20193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217298, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.355 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[54076170-f69b-4f22-ae3e-fdb32349c9db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:f337'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384394, 'tstamp': 384394}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217299, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.380 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd9a2e8-d624-4c8d-b347-9b0bde7b3fda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 20193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217300, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.425 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[69934b2a-5bc3-43e4-931f-e07abee9a50f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.513 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[53b7bbd1-928d-40a7-8d6f-8baff8e41093]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.515 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.515 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.516 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:03:02 compute-0 NetworkManager[52035]: <info>  [1759759382.5197] manager: (tap69d92bff-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct 06 14:03:02 compute-0 kernel: tap69d92bff-30: entered promiscuous mode
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.522 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:03:02 compute-0 ovn_controller[95205]: 2025-10-06T14:03:02Z|00053|binding|INFO|Releasing lport 4cb572c5-2fe1-4cc2-9aac-d044653b4542 from this chassis (sb_readonly=0)
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.549 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[826ab958-7c0d-4a4f-9875-acade5ea9f58]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.550 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.550 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.550 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 69d92bff-38df-455c-b731-a2864652e2a5 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.551 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.551 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb9bd1d-178d-4b8c-a5ad-4a2c5d82ee51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.552 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.552 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8362ac64-6966-4fd3-95bd-af22470dc056]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.553 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:03:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:02.554 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'env', 'PROCESS_TAG=haproxy-69d92bff-38df-455c-b731-a2864652e2a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/69d92bff-38df-455c-b731-a2864652e2a5.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.811 2 DEBUG nova.compute.manager [req-249648f3-cd68-419c-ad58-e1826c682439 req-7425061a-211a-4451-8acd-fc6f67fc5f67 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.812 2 DEBUG oslo_concurrency.lockutils [req-249648f3-cd68-419c-ad58-e1826c682439 req-7425061a-211a-4451-8acd-fc6f67fc5f67 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.812 2 DEBUG oslo_concurrency.lockutils [req-249648f3-cd68-419c-ad58-e1826c682439 req-7425061a-211a-4451-8acd-fc6f67fc5f67 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.812 2 DEBUG oslo_concurrency.lockutils [req-249648f3-cd68-419c-ad58-e1826c682439 req-7425061a-211a-4451-8acd-fc6f67fc5f67 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.812 2 DEBUG nova.compute.manager [req-249648f3-cd68-419c-ad58-e1826c682439 req-7425061a-211a-4451-8acd-fc6f67fc5f67 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Processing event network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:03:02 compute-0 podman[217338]: 2025-10-06 14:03:02.931833895 +0000 UTC m=+0.046728054 container create 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:03:02 compute-0 systemd[1]: Started libpod-conmon-00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00.scope.
Oct 06 14:03:02 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:03:02 compute-0 nova_compute[192903]: 2025-10-06 14:03:02.998 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:03:03 compute-0 podman[217338]: 2025-10-06 14:03:02.906563602 +0000 UTC m=+0.021457771 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.003 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:03:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c57c60c8a133e7bdce6c167312b77fdf0a4f4f4def04b71056d062caa7e380d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.007 2 INFO nova.virt.libvirt.driver [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance spawned successfully.
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.007 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:03:03 compute-0 podman[217338]: 2025-10-06 14:03:03.025337964 +0000 UTC m=+0.140232143 container init 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:03:03 compute-0 podman[217338]: 2025-10-06 14:03:03.030600216 +0000 UTC m=+0.145494375 container start 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:03:03 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [NOTICE]   (217357) : New worker (217359) forked
Oct 06 14:03:03 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [NOTICE]   (217357) : Loading success.
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.523 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.523 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.524 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.525 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.526 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:03 compute-0 nova_compute[192903]: 2025-10-06 14:03:03.527 2 DEBUG nova.virt.libvirt.driver [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.042 2 INFO nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Took 7.99 seconds to spawn the instance on the hypervisor.
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.043 2 DEBUG nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.582 2 INFO nova.compute.manager [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Took 15.13 seconds to build instance.
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.890 2 DEBUG nova.compute.manager [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.891 2 DEBUG oslo_concurrency.lockutils [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.892 2 DEBUG oslo_concurrency.lockutils [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.892 2 DEBUG oslo_concurrency.lockutils [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.893 2 DEBUG nova.compute.manager [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] No waiting events found dispatching network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:03:04 compute-0 nova_compute[192903]: 2025-10-06 14:03:04.893 2 WARNING nova.compute.manager [req-661281a3-b76b-4c37-8471-7f82c8d6a787 req-e30ea06c-2c3d-432e-bb34-ea850a8f5f32 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received unexpected event network-vif-plugged-367788c3-83c2-4360-a817-da04de69a6a2 for instance with vm_state active and task_state None.
Oct 06 14:03:05 compute-0 nova_compute[192903]: 2025-10-06 14:03:05.086 2 DEBUG oslo_concurrency.lockutils [None req-06334cf1-9384-419d-8866-dbe9e232c099 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.650s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:08 compute-0 nova_compute[192903]: 2025-10-06 14:03:08.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:09 compute-0 nova_compute[192903]: 2025-10-06 14:03:09.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:10 compute-0 podman[217368]: 2025-10-06 14:03:10.205753707 +0000 UTC m=+0.075230535 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:03:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:11.353 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:03:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:11.354 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:03:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:11.354 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:13 compute-0 podman[217395]: 2025-10-06 14:03:13.244300723 +0000 UTC m=+0.083714675 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:03:13 compute-0 podman[217396]: 2025-10-06 14:03:13.245747682 +0000 UTC m=+0.080126788 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 14:03:13 compute-0 nova_compute[192903]: 2025-10-06 14:03:13.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:13 compute-0 podman[217394]: 2025-10-06 14:03:13.308528049 +0000 UTC m=+0.154903709 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 06 14:03:14 compute-0 nova_compute[192903]: 2025-10-06 14:03:14.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:15 compute-0 ovn_controller[95205]: 2025-10-06T14:03:15Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:91:80 10.100.0.8
Oct 06 14:03:15 compute-0 ovn_controller[95205]: 2025-10-06T14:03:15Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:91:80 10.100.0.8
Oct 06 14:03:18 compute-0 nova_compute[192903]: 2025-10-06 14:03:18.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:19 compute-0 nova_compute[192903]: 2025-10-06 14:03:19.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:22 compute-0 podman[217470]: 2025-10-06 14:03:22.230221925 +0000 UTC m=+0.089726997 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:03:23 compute-0 nova_compute[192903]: 2025-10-06 14:03:23.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:24 compute-0 nova_compute[192903]: 2025-10-06 14:03:24.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:25 compute-0 podman[217490]: 2025-10-06 14:03:25.238930553 +0000 UTC m=+0.095576805 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Oct 06 14:03:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:27.614 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:03:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:27.615 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:03:27 compute-0 nova_compute[192903]: 2025-10-06 14:03:27.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:28 compute-0 nova_compute[192903]: 2025-10-06 14:03:28.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:29 compute-0 nova_compute[192903]: 2025-10-06 14:03:29.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:29 compute-0 podman[203308]: time="2025-10-06T14:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:03:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:03:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: ERROR   14:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: ERROR   14:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: ERROR   14:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: ERROR   14:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: ERROR   14:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:03:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:03:33 compute-0 nova_compute[192903]: 2025-10-06 14:03:33.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:34 compute-0 nova_compute[192903]: 2025-10-06 14:03:34.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:03:36.616 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:03:38 compute-0 nova_compute[192903]: 2025-10-06 14:03:38.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:39 compute-0 nova_compute[192903]: 2025-10-06 14:03:39.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:39 compute-0 nova_compute[192903]: 2025-10-06 14:03:39.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:39 compute-0 nova_compute[192903]: 2025-10-06 14:03:39.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:03:40 compute-0 nova_compute[192903]: 2025-10-06 14:03:40.090 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:03:41 compute-0 podman[217514]: 2025-10-06 14:03:41.192789182 +0000 UTC m=+0.057915577 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:03:43 compute-0 nova_compute[192903]: 2025-10-06 14:03:43.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:44 compute-0 podman[217539]: 2025-10-06 14:03:44.207736532 +0000 UTC m=+0.067642400 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 06 14:03:44 compute-0 podman[217545]: 2025-10-06 14:03:44.208166263 +0000 UTC m=+0.055595144 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 14:03:44 compute-0 podman[217538]: 2025-10-06 14:03:44.230314512 +0000 UTC m=+0.099394218 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:03:44 compute-0 nova_compute[192903]: 2025-10-06 14:03:44.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:45 compute-0 nova_compute[192903]: 2025-10-06 14:03:45.091 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:46 compute-0 nova_compute[192903]: 2025-10-06 14:03:46.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:46 compute-0 nova_compute[192903]: 2025-10-06 14:03:46.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:47 compute-0 nova_compute[192903]: 2025-10-06 14:03:47.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:47 compute-0 nova_compute[192903]: 2025-10-06 14:03:47.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:03:47 compute-0 nova_compute[192903]: 2025-10-06 14:03:47.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:48 compute-0 nova_compute[192903]: 2025-10-06 14:03:48.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:03:48 compute-0 nova_compute[192903]: 2025-10-06 14:03:48.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:03:48 compute-0 nova_compute[192903]: 2025-10-06 14:03:48.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:48 compute-0 nova_compute[192903]: 2025-10-06 14:03:48.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:03:48 compute-0 nova_compute[192903]: 2025-10-06 14:03:48.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.141 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.227 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.229 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.295 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.528 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.530 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.554 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.555 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.27701568603516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.555 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:03:49 compute-0 nova_compute[192903]: 2025-10-06 14:03:49.555 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:03:50 compute-0 nova_compute[192903]: 2025-10-06 14:03:50.605 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:03:50 compute-0 nova_compute[192903]: 2025-10-06 14:03:50.606 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:03:50 compute-0 nova_compute[192903]: 2025-10-06 14:03:50.606 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:03:49 up  1:04,  0 user,  load average: 0.86, 0.37, 0.41\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_20952eb66a9c4fd2905273fb8f800689': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:03:50 compute-0 nova_compute[192903]: 2025-10-06 14:03:50.656 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:03:51 compute-0 nova_compute[192903]: 2025-10-06 14:03:51.166 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:03:51 compute-0 nova_compute[192903]: 2025-10-06 14:03:51.682 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:03:51 compute-0 nova_compute[192903]: 2025-10-06 14:03:51.682 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:03:51 compute-0 nova_compute[192903]: 2025-10-06 14:03:51.683 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:51 compute-0 nova_compute[192903]: 2025-10-06 14:03:51.683 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:03:52 compute-0 nova_compute[192903]: 2025-10-06 14:03:52.192 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:53 compute-0 podman[217607]: 2025-10-06 14:03:53.250950801 +0000 UTC m=+0.107323203 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:03:53 compute-0 nova_compute[192903]: 2025-10-06 14:03:53.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:53 compute-0 nova_compute[192903]: 2025-10-06 14:03:53.698 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:53 compute-0 nova_compute[192903]: 2025-10-06 14:03:53.699 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:53 compute-0 nova_compute[192903]: 2025-10-06 14:03:53.699 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:03:54 compute-0 nova_compute[192903]: 2025-10-06 14:03:54.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:56 compute-0 podman[217627]: 2025-10-06 14:03:56.222907897 +0000 UTC m=+0.071507275 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Oct 06 14:03:58 compute-0 nova_compute[192903]: 2025-10-06 14:03:58.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:59 compute-0 nova_compute[192903]: 2025-10-06 14:03:59.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:03:59 compute-0 podman[203308]: time="2025-10-06T14:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:03:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:03:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 06 14:04:01 compute-0 openstack_network_exporter[205500]: ERROR   14:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:04:01 compute-0 openstack_network_exporter[205500]: ERROR   14:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:04:01 compute-0 openstack_network_exporter[205500]: ERROR   14:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:04:01 compute-0 openstack_network_exporter[205500]: ERROR   14:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:04:01 compute-0 openstack_network_exporter[205500]: ERROR   14:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:04:03 compute-0 nova_compute[192903]: 2025-10-06 14:04:03.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:04 compute-0 nova_compute[192903]: 2025-10-06 14:04:04.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:05 compute-0 nova_compute[192903]: 2025-10-06 14:04:05.404 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:05 compute-0 nova_compute[192903]: 2025-10-06 14:04:05.405 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:05 compute-0 nova_compute[192903]: 2025-10-06 14:04:05.910 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:04:06 compute-0 nova_compute[192903]: 2025-10-06 14:04:06.460 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:06 compute-0 nova_compute[192903]: 2025-10-06 14:04:06.461 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:06 compute-0 nova_compute[192903]: 2025-10-06 14:04:06.469 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:04:06 compute-0 nova_compute[192903]: 2025-10-06 14:04:06.470 2 INFO nova.compute.claims [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:04:07 compute-0 nova_compute[192903]: 2025-10-06 14:04:07.566 2 DEBUG nova.compute.provider_tree [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:04:08 compute-0 nova_compute[192903]: 2025-10-06 14:04:08.074 2 DEBUG nova.scheduler.client.report [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:04:08 compute-0 nova_compute[192903]: 2025-10-06 14:04:08.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:08 compute-0 nova_compute[192903]: 2025-10-06 14:04:08.584 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:08 compute-0 nova_compute[192903]: 2025-10-06 14:04:08.587 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:04:09 compute-0 nova_compute[192903]: 2025-10-06 14:04:09.101 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:04:09 compute-0 nova_compute[192903]: 2025-10-06 14:04:09.102 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:04:09 compute-0 nova_compute[192903]: 2025-10-06 14:04:09.103 2 WARNING neutronclient.v2_0.client [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:09 compute-0 nova_compute[192903]: 2025-10-06 14:04:09.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:09 compute-0 nova_compute[192903]: 2025-10-06 14:04:09.611 2 INFO nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:04:10 compute-0 nova_compute[192903]: 2025-10-06 14:04:10.121 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:04:10 compute-0 nova_compute[192903]: 2025-10-06 14:04:10.956 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Successfully created port: b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.145 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.147 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.147 2 INFO nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Creating image(s)
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.147 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.148 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.148 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.149 2 DEBUG oslo_utils.imageutils.format_inspector [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.152 2 DEBUG oslo_utils.imageutils.format_inspector [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.154 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.242 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.244 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.245 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.246 2 DEBUG oslo_utils.imageutils.format_inspector [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.254 2 DEBUG oslo_utils.imageutils.format_inspector [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.255 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.343 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.345 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:11.354 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:11.355 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:11.355 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.390 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.391 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.391 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.453 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.454 2 DEBUG nova.virt.disk.api [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Checking if we can resize image /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.454 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.504 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.504 2 DEBUG nova.virt.disk.api [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Cannot resize image /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.505 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.505 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Ensure instance console log exists: /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.506 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.506 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:11 compute-0 nova_compute[192903]: 2025-10-06 14:04:11.506 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:12 compute-0 podman[217665]: 2025-10-06 14:04:12.235533498 +0000 UTC m=+0.083421697 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.864 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Successfully updated port: b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.937 2 DEBUG nova.compute.manager [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-changed-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.937 2 DEBUG nova.compute.manager [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Refreshing instance network info cache due to event network-changed-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.938 2 DEBUG oslo_concurrency.lockutils [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.938 2 DEBUG oslo_concurrency.lockutils [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:04:13 compute-0 nova_compute[192903]: 2025-10-06 14:04:13.938 2 DEBUG nova.network.neutron [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Refreshing network info cache for port b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:04:14 compute-0 nova_compute[192903]: 2025-10-06 14:04:14.371 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:04:14 compute-0 nova_compute[192903]: 2025-10-06 14:04:14.445 2 WARNING neutronclient.v2_0.client [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:14 compute-0 nova_compute[192903]: 2025-10-06 14:04:14.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:14 compute-0 nova_compute[192903]: 2025-10-06 14:04:14.736 2 DEBUG nova.network.neutron [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:04:14 compute-0 nova_compute[192903]: 2025-10-06 14:04:14.898 2 DEBUG nova.network.neutron [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:04:15 compute-0 podman[217691]: 2025-10-06 14:04:15.196703122 +0000 UTC m=+0.053100237 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:04:15 compute-0 podman[217690]: 2025-10-06 14:04:15.228589674 +0000 UTC m=+0.085698378 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 06 14:04:15 compute-0 podman[217689]: 2025-10-06 14:04:15.247751792 +0000 UTC m=+0.104135057 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:04:15 compute-0 nova_compute[192903]: 2025-10-06 14:04:15.405 2 DEBUG oslo_concurrency.lockutils [req-ff2bdaff-218b-4c8d-a5e3-7fda1fcbec9e req-c36b8779-4ca5-417b-adcb-d9566c6b04b9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:04:15 compute-0 nova_compute[192903]: 2025-10-06 14:04:15.406 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquired lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:04:15 compute-0 nova_compute[192903]: 2025-10-06 14:04:15.407 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:04:16 compute-0 nova_compute[192903]: 2025-10-06 14:04:16.319 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:04:16 compute-0 nova_compute[192903]: 2025-10-06 14:04:16.669 2 WARNING neutronclient.v2_0.client [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:16 compute-0 nova_compute[192903]: 2025-10-06 14:04:16.858 2 DEBUG nova.network.neutron [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Updating instance_info_cache with network_info: [{"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.366 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Releasing lock "refresh_cache-ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.366 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance network_info: |[{"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.369 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Start _get_guest_xml network_info=[{"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.374 2 WARNING nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.375 2 DEBUG nova.virt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-2070851038', uuid='ea9b1b2c-e123-4a8b-a2ef-f29e14732d20'), owner=OwnerMeta(userid='4beaed30a2ec47bb9b5f6adb81ede0f7', username='tempest-TestExecuteActionsViaActuator-1260248176-project-admin', projectid='20952eb66a9c4fd2905273fb8f800689', projectname='tempest-TestExecuteActionsViaActuator-1260248176'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759457.3755176) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.382 2 DEBUG nova.virt.libvirt.host [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.382 2 DEBUG nova.virt.libvirt.host [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.386 2 DEBUG nova.virt.libvirt.host [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.386 2 DEBUG nova.virt.libvirt.host [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.387 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.387 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.388 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.388 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.388 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.389 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.389 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.389 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.389 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.390 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.390 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.390 2 DEBUG nova.virt.hardware [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.395 2 DEBUG nova.virt.libvirt.vif [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:04:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2070851038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2070851038',id=7,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-y3zwwbvy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:04:10Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=ea9b1b2c-e123-4a8b-a2ef-f29e14732d20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.395 2 DEBUG nova.network.os_vif_util [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.396 2 DEBUG nova.network.os_vif_util [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.397 2 DEBUG nova.objects.instance [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'pci_devices' on Instance uuid ea9b1b2c-e123-4a8b-a2ef-f29e14732d20 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.909 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <uuid>ea9b1b2c-e123-4a8b-a2ef-f29e14732d20</uuid>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <name>instance-00000007</name>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-2070851038</nova:name>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:04:17</nova:creationTime>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:04:17 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:04:17 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:user uuid="4beaed30a2ec47bb9b5f6adb81ede0f7">tempest-TestExecuteActionsViaActuator-1260248176-project-admin</nova:user>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:project uuid="20952eb66a9c4fd2905273fb8f800689">tempest-TestExecuteActionsViaActuator-1260248176</nova:project>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         <nova:port uuid="b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8">
Oct 06 14:04:17 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <system>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="serial">ea9b1b2c-e123-4a8b-a2ef-f29e14732d20</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="uuid">ea9b1b2c-e123-4a8b-a2ef-f29e14732d20</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </system>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <os>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </os>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <features>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </features>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.config"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:77:4a:68"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <target dev="tapb1f44cfa-c6"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/console.log" append="off"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <video>
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </video>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:04:17 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:04:17 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:04:17 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:04:17 compute-0 nova_compute[192903]: </domain>
Oct 06 14:04:17 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.911 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Preparing to wait for external event network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.911 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.911 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.911 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.912 2 DEBUG nova.virt.libvirt.vif [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:04:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2070851038',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2070851038',id=7,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-y3zwwbvy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:04:10Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=ea9b1b2c-e123-4a8b-a2ef-f29e14732d20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.913 2 DEBUG nova.network.os_vif_util [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.913 2 DEBUG nova.network.os_vif_util [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.914 2 DEBUG os_vif [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.915 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.915 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e0c23358-46f6-555e-ab58-a9f85260fa7f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f44cfa-c6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb1f44cfa-c6, col_values=(('qos', UUID('e89a54ce-ac98-4fa8-b2c8-b06e613ac071')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.924 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb1f44cfa-c6, col_values=(('external_ids', {'iface-id': 'b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:4a:68', 'vm-uuid': 'ea9b1b2c-e123-4a8b-a2ef-f29e14732d20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 NetworkManager[52035]: <info>  [1759759457.9274] manager: (tapb1f44cfa-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:17 compute-0 nova_compute[192903]: 2025-10-06 14:04:17.936 2 INFO os_vif [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6')
Oct 06 14:04:18 compute-0 nova_compute[192903]: 2025-10-06 14:04:18.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:19 compute-0 nova_compute[192903]: 2025-10-06 14:04:19.536 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:04:19 compute-0 nova_compute[192903]: 2025-10-06 14:04:19.536 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:04:19 compute-0 nova_compute[192903]: 2025-10-06 14:04:19.537 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No VIF found with MAC fa:16:3e:77:4a:68, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:04:19 compute-0 nova_compute[192903]: 2025-10-06 14:04:19.538 2 INFO nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Using config drive
Oct 06 14:04:20 compute-0 nova_compute[192903]: 2025-10-06 14:04:20.104 2 WARNING neutronclient.v2_0.client [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:20 compute-0 nova_compute[192903]: 2025-10-06 14:04:20.910 2 INFO nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Creating config drive at /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.config
Oct 06 14:04:20 compute-0 nova_compute[192903]: 2025-10-06 14:04:20.917 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpjwmn8z49 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.056 2 DEBUG oslo_concurrency.processutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpjwmn8z49" returned: 0 in 0.139s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:21 compute-0 kernel: tapb1f44cfa-c6: entered promiscuous mode
Oct 06 14:04:21 compute-0 NetworkManager[52035]: <info>  [1759759461.1318] manager: (tapb1f44cfa-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Oct 06 14:04:21 compute-0 ovn_controller[95205]: 2025-10-06T14:04:21Z|00054|binding|INFO|Claiming lport b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 for this chassis.
Oct 06 14:04:21 compute-0 ovn_controller[95205]: 2025-10-06T14:04:21Z|00055|binding|INFO|b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8: Claiming fa:16:3e:77:4a:68 10.100.0.7
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.140 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:4a:68 10.100.0.7'], port_security=['fa:16:3e:77:4a:68 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ea9b1b2c-e123-4a8b-a2ef-f29e14732d20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.141 104072 INFO neutron.agent.ovn.metadata.agent [-] Port b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 bound to our chassis
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.143 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:04:21 compute-0 ovn_controller[95205]: 2025-10-06T14:04:21Z|00056|binding|INFO|Setting lport b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 ovn-installed in OVS
Oct 06 14:04:21 compute-0 ovn_controller[95205]: 2025-10-06T14:04:21Z|00057|binding|INFO|Setting lport b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 up in Southbound
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.165 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9ce393-1c41-45fe-8ed6-fff28fd45e51]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:21 compute-0 systemd-machined[152985]: New machine qemu-3-instance-00000007.
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.194 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[a6467dae-1575-451a-a409-ff40d1556c12]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.197 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[4439491b-1ef9-48c1-85f2-c1f243cf7e74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Oct 06 14:04:21 compute-0 systemd-udevd[217787]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.229 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[aeea5350-eac7-41c8-876e-a9ea42a96eb7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 NetworkManager[52035]: <info>  [1759759461.2387] device (tapb1f44cfa-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:04:21 compute-0 NetworkManager[52035]: <info>  [1759759461.2402] device (tapb1f44cfa-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.249 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[04dcd432-0612-46ff-aba8-e0716351484a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 20193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217789, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.266 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0b3cfa-a63d-4420-ac0b-2eb09e0fe7eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217793, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217793, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.267 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.270 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.271 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.271 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.271 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:04:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:21.272 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d409e6b8-7420-4902-8d6a-4981511e049a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.915 2 DEBUG nova.compute.manager [req-830a5418-f8ef-4ad8-9120-7ab0e216949c req-75729399-b9fd-4e0f-9881-c7db590ec03b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.916 2 DEBUG oslo_concurrency.lockutils [req-830a5418-f8ef-4ad8-9120-7ab0e216949c req-75729399-b9fd-4e0f-9881-c7db590ec03b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.917 2 DEBUG oslo_concurrency.lockutils [req-830a5418-f8ef-4ad8-9120-7ab0e216949c req-75729399-b9fd-4e0f-9881-c7db590ec03b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.917 2 DEBUG oslo_concurrency.lockutils [req-830a5418-f8ef-4ad8-9120-7ab0e216949c req-75729399-b9fd-4e0f-9881-c7db590ec03b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:21 compute-0 nova_compute[192903]: 2025-10-06 14:04:21.917 2 DEBUG nova.compute.manager [req-830a5418-f8ef-4ad8-9120-7ab0e216949c req-75729399-b9fd-4e0f-9881-c7db590ec03b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Processing event network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.371 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.378 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.384 2 INFO nova.virt.libvirt.driver [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance spawned successfully.
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.385 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.900 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.900 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.901 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.901 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.901 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.902 2 DEBUG nova.virt.libvirt.driver [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:04:22 compute-0 nova_compute[192903]: 2025-10-06 14:04:22.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:23 compute-0 nova_compute[192903]: 2025-10-06 14:04:23.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:23 compute-0 nova_compute[192903]: 2025-10-06 14:04:23.413 2 INFO nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Took 12.27 seconds to spawn the instance on the hypervisor.
Oct 06 14:04:23 compute-0 nova_compute[192903]: 2025-10-06 14:04:23.414 2 DEBUG nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:04:23 compute-0 nova_compute[192903]: 2025-10-06 14:04:23.956 2 INFO nova.compute.manager [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Took 17.54 seconds to build instance.
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.004 2 DEBUG nova.compute.manager [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.004 2 DEBUG oslo_concurrency.lockutils [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.005 2 DEBUG oslo_concurrency.lockutils [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.005 2 DEBUG oslo_concurrency.lockutils [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.006 2 DEBUG nova.compute.manager [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] No waiting events found dispatching network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.006 2 WARNING nova.compute.manager [req-65711e15-a13d-407a-a7cc-5dd719b368b9 req-0e15682b-e037-4d88-8c68-b7b07dc0b5d0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received unexpected event network-vif-plugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 for instance with vm_state active and task_state None.
Oct 06 14:04:24 compute-0 podman[217805]: 2025-10-06 14:04:24.240703024 +0000 UTC m=+0.095710879 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:04:24 compute-0 nova_compute[192903]: 2025-10-06 14:04:24.461 2 DEBUG oslo_concurrency.lockutils [None req-71908777-a63e-4fb9-b06a-b26e6b1282e9 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.057s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:27 compute-0 podman[217825]: 2025-10-06 14:04:27.225419606 +0000 UTC m=+0.083824288 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 14:04:27 compute-0 nova_compute[192903]: 2025-10-06 14:04:27.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:28 compute-0 nova_compute[192903]: 2025-10-06 14:04:28.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:29 compute-0 podman[203308]: time="2025-10-06T14:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:04:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:04:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3461 "" "Go-http-client/1.1"
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: ERROR   14:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: ERROR   14:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: ERROR   14:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: ERROR   14:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: ERROR   14:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:04:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:04:32 compute-0 nova_compute[192903]: 2025-10-06 14:04:32.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:33 compute-0 nova_compute[192903]: 2025-10-06 14:04:33.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:34 compute-0 ovn_controller[95205]: 2025-10-06T14:04:34Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:4a:68 10.100.0.7
Oct 06 14:04:34 compute-0 ovn_controller[95205]: 2025-10-06T14:04:34Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:4a:68 10.100.0.7
Oct 06 14:04:37 compute-0 nova_compute[192903]: 2025-10-06 14:04:37.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:38 compute-0 nova_compute[192903]: 2025-10-06 14:04:38.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:42 compute-0 nova_compute[192903]: 2025-10-06 14:04:42.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:43 compute-0 podman[217859]: 2025-10-06 14:04:43.242137244 +0000 UTC m=+0.087648148 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:04:43 compute-0 nova_compute[192903]: 2025-10-06 14:04:43.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:45 compute-0 nova_compute[192903]: 2025-10-06 14:04:45.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:46 compute-0 podman[217886]: 2025-10-06 14:04:46.240829482 +0000 UTC m=+0.088066570 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 06 14:04:46 compute-0 podman[217887]: 2025-10-06 14:04:46.241529281 +0000 UTC m=+0.078124070 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:04:46 compute-0 podman[217885]: 2025-10-06 14:04:46.280111267 +0000 UTC m=+0.128991089 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:04:46 compute-0 nova_compute[192903]: 2025-10-06 14:04:46.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:47 compute-0 nova_compute[192903]: 2025-10-06 14:04:47.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:47 compute-0 nova_compute[192903]: 2025-10-06 14:04:47.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:48 compute-0 nova_compute[192903]: 2025-10-06 14:04:48.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:48 compute-0 nova_compute[192903]: 2025-10-06 14:04:48.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:48 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:48.980 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:04:48 compute-0 nova_compute[192903]: 2025-10-06 14:04:48.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:48 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:48.982 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:04:49 compute-0 nova_compute[192903]: 2025-10-06 14:04:49.092 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:49 compute-0 nova_compute[192903]: 2025-10-06 14:04:49.093 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:04:49 compute-0 nova_compute[192903]: 2025-10-06 14:04:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:49 compute-0 nova_compute[192903]: 2025-10-06 14:04:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:50 compute-0 nova_compute[192903]: 2025-10-06 14:04:50.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:50 compute-0 nova_compute[192903]: 2025-10-06 14:04:50.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:50 compute-0 nova_compute[192903]: 2025-10-06 14:04:50.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:50 compute-0 nova_compute[192903]: 2025-10-06 14:04:50.099 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.153 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.213 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.215 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.309 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.315 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.376 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.378 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.450 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.625 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.627 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.650 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.651 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5543MB free_disk=73.24800109863281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.651 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:51 compute-0 nova_compute[192903]: 2025-10-06 14:04:51.651 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.738 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.739 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance ea9b1b2c-e123-4a8b-a2ef-f29e14732d20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.739 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.739 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:04:51 up  1:05,  0 user,  load average: 0.48, 0.35, 0.39\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_20952eb66a9c4fd2905273fb8f800689': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.843 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:04:52 compute-0 nova_compute[192903]: 2025-10-06 14:04:52.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:53 compute-0 nova_compute[192903]: 2025-10-06 14:04:53.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:53 compute-0 nova_compute[192903]: 2025-10-06 14:04:53.357 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:04:53 compute-0 nova_compute[192903]: 2025-10-06 14:04:53.867 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:04:53 compute-0 nova_compute[192903]: 2025-10-06 14:04:53.867 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.216s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:04:53.984 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:04:54 compute-0 nova_compute[192903]: 2025-10-06 14:04:54.448 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:54 compute-0 nova_compute[192903]: 2025-10-06 14:04:54.448 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:54 compute-0 nova_compute[192903]: 2025-10-06 14:04:54.867 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:54 compute-0 nova_compute[192903]: 2025-10-06 14:04:54.867 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:04:54 compute-0 nova_compute[192903]: 2025-10-06 14:04:54.953 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:04:55 compute-0 podman[217965]: 2025-10-06 14:04:55.223808398 +0000 UTC m=+0.079106376 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:04:55 compute-0 nova_compute[192903]: 2025-10-06 14:04:55.486 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:04:55 compute-0 nova_compute[192903]: 2025-10-06 14:04:55.486 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:04:55 compute-0 nova_compute[192903]: 2025-10-06 14:04:55.493 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:04:55 compute-0 nova_compute[192903]: 2025-10-06 14:04:55.493 2 INFO nova.compute.claims [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:04:56 compute-0 nova_compute[192903]: 2025-10-06 14:04:56.576 2 DEBUG nova.compute.provider_tree [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:04:57 compute-0 nova_compute[192903]: 2025-10-06 14:04:57.084 2 DEBUG nova.scheduler.client.report [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:04:57 compute-0 nova_compute[192903]: 2025-10-06 14:04:57.594 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:04:57 compute-0 nova_compute[192903]: 2025-10-06 14:04:57.595 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:04:57 compute-0 nova_compute[192903]: 2025-10-06 14:04:57.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.105 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.105 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.105 2 WARNING neutronclient.v2_0.client [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.106 2 WARNING neutronclient.v2_0.client [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:04:58 compute-0 podman[217985]: 2025-10-06 14:04:58.215033212 +0000 UTC m=+0.074380688 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:04:58 compute-0 nova_compute[192903]: 2025-10-06 14:04:58.615 2 INFO nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.123 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.204 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Successfully created port: 1a60ab0b-06f0-436a-a116-c1d328ad3203 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:04:59 compute-0 podman[203308]: time="2025-10-06T14:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:04:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:04:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3459 "" "Go-http-client/1.1"
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.777 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Successfully updated port: 1a60ab0b-06f0-436a-a116-c1d328ad3203 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.836 2 DEBUG nova.compute.manager [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-changed-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.837 2 DEBUG nova.compute.manager [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Refreshing instance network info cache due to event network-changed-1a60ab0b-06f0-436a-a116-c1d328ad3203. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.837 2 DEBUG oslo_concurrency.lockutils [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.837 2 DEBUG oslo_concurrency.lockutils [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:04:59 compute-0 nova_compute[192903]: 2025-10-06 14:04:59.837 2 DEBUG nova.network.neutron [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Refreshing network info cache for port 1a60ab0b-06f0-436a-a116-c1d328ad3203 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.146 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.148 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.148 2 INFO nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Creating image(s)
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.149 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.150 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.151 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.152 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.160 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.162 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.253 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.254 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.255 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.255 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.259 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.260 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.283 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.317 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.318 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.343 2 WARNING neutronclient.v2_0.client [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.348 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.348 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.349 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.402 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.404 2 DEBUG nova.virt.disk.api [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Checking if we can resize image /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.404 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.451 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.453 2 DEBUG nova.virt.disk.api [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Cannot resize image /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.453 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.454 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Ensure instance console log exists: /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.455 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.455 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.456 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.749 2 DEBUG nova.network.neutron [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:05:00 compute-0 nova_compute[192903]: 2025-10-06 14:05:00.965 2 DEBUG nova.network.neutron [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: ERROR   14:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: ERROR   14:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: ERROR   14:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: ERROR   14:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: ERROR   14:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:05:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:05:01 compute-0 nova_compute[192903]: 2025-10-06 14:05:01.475 2 DEBUG oslo_concurrency.lockutils [req-10902a1d-6fa8-4223-b332-1d11486135a9 req-ee8703e3-bd09-4824-95c5-a0566b75aef4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:01 compute-0 nova_compute[192903]: 2025-10-06 14:05:01.477 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquired lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:01 compute-0 nova_compute[192903]: 2025-10-06 14:05:01.477 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:05:02 compute-0 nova_compute[192903]: 2025-10-06 14:05:02.743 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:05:02 compute-0 nova_compute[192903]: 2025-10-06 14:05:02.974 2 WARNING neutronclient.v2_0.client [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:02 compute-0 nova_compute[192903]: 2025-10-06 14:05:02.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.156 2 DEBUG nova.network.neutron [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Updating instance_info_cache with network_info: [{"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.666 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Releasing lock "refresh_cache-a8449b2e-50c6-45a4-b201-210240c50968" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.667 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance network_info: |[{"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.671 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Start _get_guest_xml network_info=[{"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.676 2 WARNING nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.678 2 DEBUG nova.virt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1232667021', uuid='a8449b2e-50c6-45a4-b201-210240c50968'), owner=OwnerMeta(userid='4beaed30a2ec47bb9b5f6adb81ede0f7', username='tempest-TestExecuteActionsViaActuator-1260248176-project-admin', projectid='20952eb66a9c4fd2905273fb8f800689', projectname='tempest-TestExecuteActionsViaActuator-1260248176'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": 
"1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759503.6781833) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.685 2 DEBUG nova.virt.libvirt.host [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.686 2 DEBUG nova.virt.libvirt.host [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.689 2 DEBUG nova.virt.libvirt.host [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.690 2 DEBUG nova.virt.libvirt.host [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.691 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.691 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.692 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.693 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.693 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.694 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.694 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.695 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.695 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.696 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.696 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.697 2 DEBUG nova.virt.hardware [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.703 2 DEBUG nova.virt.libvirt.vif [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1232667021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1232667021',id=9,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-ofptvqyy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:04:59Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=a8449b2e-50c6-45a4-b201-210240c50968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.703 2 DEBUG nova.network.os_vif_util [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.705 2 DEBUG nova.network.os_vif_util [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:05:03 compute-0 nova_compute[192903]: 2025-10-06 14:05:03.706 2 DEBUG nova.objects.instance [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'pci_devices' on Instance uuid a8449b2e-50c6-45a4-b201-210240c50968 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.218 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <uuid>a8449b2e-50c6-45a4-b201-210240c50968</uuid>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <name>instance-00000009</name>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1232667021</nova:name>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:05:03</nova:creationTime>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:05:04 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:05:04 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:user uuid="4beaed30a2ec47bb9b5f6adb81ede0f7">tempest-TestExecuteActionsViaActuator-1260248176-project-admin</nova:user>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:project uuid="20952eb66a9c4fd2905273fb8f800689">tempest-TestExecuteActionsViaActuator-1260248176</nova:project>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         <nova:port uuid="1a60ab0b-06f0-436a-a116-c1d328ad3203">
Oct 06 14:05:04 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <system>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="serial">a8449b2e-50c6-45a4-b201-210240c50968</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="uuid">a8449b2e-50c6-45a4-b201-210240c50968</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </system>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <os>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </os>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <features>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </features>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.config"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:f1:4d:4d"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <target dev="tap1a60ab0b-06"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/console.log" append="off"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <video>
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </video>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:05:04 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:05:04 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:05:04 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:05:04 compute-0 nova_compute[192903]: </domain>
Oct 06 14:05:04 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.220 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Preparing to wait for external event network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.221 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.222 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.222 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.224 2 DEBUG nova.virt.libvirt.vif [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1232667021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1232667021',id=9,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-ofptvqyy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:04:59Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=a8449b2e-50c6-45a4-b201-210240c50968,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.224 2 DEBUG nova.network.os_vif_util [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.225 2 DEBUG nova.network.os_vif_util [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.226 2 DEBUG os_vif [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a82be443-ad82-5c46-81c5-2c657cd8395f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a60ab0b-06, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1a60ab0b-06, col_values=(('qos', UUID('e6c1766c-d5bd-4d76-8f0c-f4dab51cb4d0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1a60ab0b-06, col_values=(('external_ids', {'iface-id': '1a60ab0b-06f0-436a-a116-c1d328ad3203', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:4d:4d', 'vm-uuid': 'a8449b2e-50c6-45a4-b201-210240c50968'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 NetworkManager[52035]: <info>  [1759759504.2930] manager: (tap1a60ab0b-06): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:04 compute-0 nova_compute[192903]: 2025-10-06 14:05:04.302 2 INFO os_vif [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06')
Oct 06 14:05:05 compute-0 nova_compute[192903]: 2025-10-06 14:05:05.840 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:05:05 compute-0 nova_compute[192903]: 2025-10-06 14:05:05.841 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:05:05 compute-0 nova_compute[192903]: 2025-10-06 14:05:05.842 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] No VIF found with MAC fa:16:3e:f1:4d:4d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:05:05 compute-0 nova_compute[192903]: 2025-10-06 14:05:05.843 2 INFO nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Using config drive
Oct 06 14:05:06 compute-0 nova_compute[192903]: 2025-10-06 14:05:06.359 2 WARNING neutronclient.v2_0.client [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:06 compute-0 nova_compute[192903]: 2025-10-06 14:05:06.741 2 INFO nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Creating config drive at /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.config
Oct 06 14:05:06 compute-0 nova_compute[192903]: 2025-10-06 14:05:06.752 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpxeoflxv_ execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:06 compute-0 nova_compute[192903]: 2025-10-06 14:05:06.894 2 DEBUG oslo_concurrency.processutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpxeoflxv_" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:06 compute-0 NetworkManager[52035]: <info>  [1759759506.9785] manager: (tap1a60ab0b-06): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Oct 06 14:05:06 compute-0 kernel: tap1a60ab0b-06: entered promiscuous mode
Oct 06 14:05:06 compute-0 ovn_controller[95205]: 2025-10-06T14:05:06Z|00058|binding|INFO|Claiming lport 1a60ab0b-06f0-436a-a116-c1d328ad3203 for this chassis.
Oct 06 14:05:06 compute-0 nova_compute[192903]: 2025-10-06 14:05:06.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:06 compute-0 ovn_controller[95205]: 2025-10-06T14:05:06Z|00059|binding|INFO|1a60ab0b-06f0-436a-a116-c1d328ad3203: Claiming fa:16:3e:f1:4d:4d 10.100.0.12
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:07 compute-0 ovn_controller[95205]: 2025-10-06T14:05:07Z|00060|binding|INFO|Setting lport 1a60ab0b-06f0-436a-a116-c1d328ad3203 ovn-installed in OVS
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:07 compute-0 ovn_controller[95205]: 2025-10-06T14:05:07Z|00061|binding|INFO|Setting lport 1a60ab0b-06f0-436a-a116-c1d328ad3203 up in Southbound
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.015 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:4d:4d 10.100.0.12'], port_security=['fa:16:3e:f1:4d:4d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a8449b2e-50c6-45a4-b201-210240c50968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=1a60ab0b-06f0-436a-a116-c1d328ad3203) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.016 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 1a60ab0b-06f0-436a-a116-c1d328ad3203 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 bound to our chassis
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.018 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:05:07 compute-0 systemd-udevd[218040]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.033 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[535d0345-3bf5-4749-8368-b6deaba6f003]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 systemd-machined[152985]: New machine qemu-4-instance-00000009.
Oct 06 14:05:07 compute-0 NetworkManager[52035]: <info>  [1759759507.0417] device (tap1a60ab0b-06): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:05:07 compute-0 NetworkManager[52035]: <info>  [1759759507.0434] device (tap1a60ab0b-06): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:05:07 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.075 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[618fcc6c-3401-494d-b798-f125e2d5a09c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.078 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[b83da08c-9b3d-4ce9-bde5-d8388a1fffc7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.114 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc3181f-5ac1-4f3b-a95d-e684caab589e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.137 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[34411fc9-cefc-4f77-8e42-98e9cf5cbdf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218054, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.154 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dee658ce-f88d-4b05-8493-e70c1107d550]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218056, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218056, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.155 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.159 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.159 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.159 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.160 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:07.161 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[338b18d0-7dc3-449e-af9c-e28b221427b4]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.285 2 DEBUG nova.compute.manager [req-46a25715-39b2-4610-a3ff-16b2a39d543c req-76addfef-e806-4c2f-93e5-e69f12c28e52 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.285 2 DEBUG oslo_concurrency.lockutils [req-46a25715-39b2-4610-a3ff-16b2a39d543c req-76addfef-e806-4c2f-93e5-e69f12c28e52 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.286 2 DEBUG oslo_concurrency.lockutils [req-46a25715-39b2-4610-a3ff-16b2a39d543c req-76addfef-e806-4c2f-93e5-e69f12c28e52 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.286 2 DEBUG oslo_concurrency.lockutils [req-46a25715-39b2-4610-a3ff-16b2a39d543c req-76addfef-e806-4c2f-93e5-e69f12c28e52 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:07 compute-0 nova_compute[192903]: 2025-10-06 14:05:07.286 2 DEBUG nova.compute.manager [req-46a25715-39b2-4610-a3ff-16b2a39d543c req-76addfef-e806-4c2f-93e5-e69f12c28e52 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Processing event network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.110 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.115 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.119 2 INFO nova.virt.libvirt.driver [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance spawned successfully.
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.119 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.650 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.651 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.651 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.652 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.652 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:08 compute-0 nova_compute[192903]: 2025-10-06 14:05:08.652 2 DEBUG nova.virt.libvirt.driver [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.164 2 INFO nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Took 9.02 seconds to spawn the instance on the hypervisor.
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.164 2 DEBUG nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.356 2 DEBUG nova.compute.manager [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.356 2 DEBUG oslo_concurrency.lockutils [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.356 2 DEBUG oslo_concurrency.lockutils [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.357 2 DEBUG oslo_concurrency.lockutils [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.357 2 DEBUG nova.compute.manager [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] No waiting events found dispatching network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.357 2 WARNING nova.compute.manager [req-14561635-8da3-478c-abf3-a5d80b3de587 req-404ffbb8-fc8f-4fea-9123-9d0d61b3eae5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received unexpected event network-vif-plugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 for instance with vm_state active and task_state None.
Oct 06 14:05:09 compute-0 nova_compute[192903]: 2025-10-06 14:05:09.701 2 INFO nova.compute.manager [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Took 14.24 seconds to build instance.
Oct 06 14:05:10 compute-0 nova_compute[192903]: 2025-10-06 14:05:10.210 2 DEBUG oslo_concurrency.lockutils [None req-9bf2ddc3-5329-4856-80b6-5b016cb65a4e 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.761s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:11.356 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:11.357 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:11.357 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:13 compute-0 nova_compute[192903]: 2025-10-06 14:05:13.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:14 compute-0 podman[218065]: 2025-10-06 14:05:14.218067601 +0000 UTC m=+0.075265153 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:05:14 compute-0 nova_compute[192903]: 2025-10-06 14:05:14.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:17 compute-0 podman[218091]: 2025-10-06 14:05:17.221930358 +0000 UTC m=+0.075840878 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 14:05:17 compute-0 podman[218092]: 2025-10-06 14:05:17.253124574 +0000 UTC m=+0.100720132 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:05:17 compute-0 podman[218090]: 2025-10-06 14:05:17.283683183 +0000 UTC m=+0.137496570 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Oct 06 14:05:18 compute-0 nova_compute[192903]: 2025-10-06 14:05:18.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:19 compute-0 nova_compute[192903]: 2025-10-06 14:05:19.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:19 compute-0 ovn_controller[95205]: 2025-10-06T14:05:19Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:4d:4d 10.100.0.12
Oct 06 14:05:19 compute-0 ovn_controller[95205]: 2025-10-06T14:05:19Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:4d:4d 10.100.0.12
Oct 06 14:05:22 compute-0 nova_compute[192903]: 2025-10-06 14:05:22.982 2 DEBUG nova.compute.manager [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6169
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.223 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Creating tmpfile /var/lib/nova/instances/tmpwsh5qanu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.224 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.318 2 DEBUG nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwsh5qanu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.341 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.342 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.508 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.508 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.849 2 INFO nova.compute.rpcapi [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Oct 06 14:05:23 compute-0 nova_compute[192903]: 2025-10-06 14:05:23.850 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:24 compute-0 nova_compute[192903]: 2025-10-06 14:05:24.023 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'pci_requests' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:24 compute-0 nova_compute[192903]: 2025-10-06 14:05:24.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:24 compute-0 nova_compute[192903]: 2025-10-06 14:05:24.530 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:05:24 compute-0 nova_compute[192903]: 2025-10-06 14:05:24.531 2 INFO nova.compute.claims [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:05:24 compute-0 nova_compute[192903]: 2025-10-06 14:05:24.531 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'resources' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:25 compute-0 nova_compute[192903]: 2025-10-06 14:05:25.037 2 DEBUG nova.objects.base [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<e16d2f31-6d64-4d53-8f79-78ea4befde4a> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:05:25 compute-0 nova_compute[192903]: 2025-10-06 14:05:25.038 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'numa_topology' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:25 compute-0 nova_compute[192903]: 2025-10-06 14:05:25.582 2 DEBUG nova.objects.base [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<e16d2f31-6d64-4d53-8f79-78ea4befde4a> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:05:25 compute-0 nova_compute[192903]: 2025-10-06 14:05:25.582 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'pci_devices' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:25 compute-0 nova_compute[192903]: 2025-10-06 14:05:25.865 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:26 compute-0 nova_compute[192903]: 2025-10-06 14:05:26.090 2 DEBUG nova.objects.base [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<e16d2f31-6d64-4d53-8f79-78ea4befde4a> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:05:26 compute-0 podman[218185]: 2025-10-06 14:05:26.22075015 +0000 UTC m=+0.083106734 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 06 14:05:26 compute-0 nova_compute[192903]: 2025-10-06 14:05:26.602 2 INFO nova.compute.resource_tracker [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating resource usage from migration dee5880d-3d5b-4f77-ae9c-7a0bec90ba44
Oct 06 14:05:26 compute-0 nova_compute[192903]: 2025-10-06 14:05:26.603 2 DEBUG nova.compute.resource_tracker [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Starting to track incoming migration dee5880d-3d5b-4f77-ae9c-7a0bec90ba44 with flavor 8cb06c85-e9e7-417f-906b-1f7cf29f7de9 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 06 14:05:27 compute-0 nova_compute[192903]: 2025-10-06 14:05:27.200 2 DEBUG nova.compute.provider_tree [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:05:27 compute-0 nova_compute[192903]: 2025-10-06 14:05:27.711 2 DEBUG nova.scheduler.client.report [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:05:28 compute-0 nova_compute[192903]: 2025-10-06 14:05:28.222 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.713s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:28 compute-0 nova_compute[192903]: 2025-10-06 14:05:28.222 2 INFO nova.compute.manager [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Migrating
Oct 06 14:05:28 compute-0 nova_compute[192903]: 2025-10-06 14:05:28.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:29 compute-0 podman[218205]: 2025-10-06 14:05:29.232934303 +0000 UTC m=+0.085484369 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=)
Oct 06 14:05:29 compute-0 nova_compute[192903]: 2025-10-06 14:05:29.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:29 compute-0 nova_compute[192903]: 2025-10-06 14:05:29.608 2 DEBUG nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwsh5qanu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9453c63e-8e53-4d5f-9571-c0dfe2365ef9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:05:29 compute-0 podman[203308]: time="2025-10-06T14:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:05:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:05:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3461 "" "Go-http-client/1.1"
Oct 06 14:05:30 compute-0 nova_compute[192903]: 2025-10-06 14:05:30.628 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:30 compute-0 nova_compute[192903]: 2025-10-06 14:05:30.629 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:30 compute-0 nova_compute[192903]: 2025-10-06 14:05:30.629 2 DEBUG nova.network.neutron [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:05:31 compute-0 nova_compute[192903]: 2025-10-06 14:05:31.139 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: ERROR   14:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: ERROR   14:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: ERROR   14:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: ERROR   14:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: ERROR   14:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:05:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:05:32 compute-0 nova_compute[192903]: 2025-10-06 14:05:32.021 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:32 compute-0 sshd-session[218228]: Accepted publickey for nova from 192.168.122.101 port 48236 ssh2: ECDSA SHA256:XfgxSMU36lLkRyylGtLNNBmata20M8em0i/7ZVz7Bx4
Oct 06 14:05:32 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 06 14:05:32 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 06 14:05:32 compute-0 systemd-logind[789]: New session 29 of user nova.
Oct 06 14:05:32 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 06 14:05:32 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 06 14:05:32 compute-0 systemd[218233]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:32 compute-0 systemd[218233]: Queued start job for default target Main User Target.
Oct 06 14:05:32 compute-0 systemd[218233]: Created slice User Application Slice.
Oct 06 14:05:32 compute-0 systemd[218233]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 06 14:05:32 compute-0 systemd[218233]: Started Daily Cleanup of User's Temporary Directories.
Oct 06 14:05:32 compute-0 systemd[218233]: Reached target Paths.
Oct 06 14:05:32 compute-0 systemd[218233]: Reached target Timers.
Oct 06 14:05:32 compute-0 systemd[218233]: Starting D-Bus User Message Bus Socket...
Oct 06 14:05:32 compute-0 systemd[218233]: Starting Create User's Volatile Files and Directories...
Oct 06 14:05:32 compute-0 systemd[218233]: Listening on D-Bus User Message Bus Socket.
Oct 06 14:05:32 compute-0 systemd[218233]: Reached target Sockets.
Oct 06 14:05:32 compute-0 systemd[218233]: Finished Create User's Volatile Files and Directories.
Oct 06 14:05:32 compute-0 systemd[218233]: Reached target Basic System.
Oct 06 14:05:32 compute-0 systemd[218233]: Reached target Main User Target.
Oct 06 14:05:32 compute-0 systemd[218233]: Startup finished in 177ms.
Oct 06 14:05:32 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 06 14:05:32 compute-0 systemd[1]: Started Session 29 of User nova.
Oct 06 14:05:32 compute-0 sshd-session[218228]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:32 compute-0 sshd-session[218248]: Received disconnect from 192.168.122.101 port 48236:11: disconnected by user
Oct 06 14:05:32 compute-0 sshd-session[218248]: Disconnected from user nova 192.168.122.101 port 48236
Oct 06 14:05:32 compute-0 sshd-session[218228]: pam_unix(sshd:session): session closed for user nova
Oct 06 14:05:32 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Oct 06 14:05:32 compute-0 systemd-logind[789]: Session 29 logged out. Waiting for processes to exit.
Oct 06 14:05:32 compute-0 systemd-logind[789]: Removed session 29.
Oct 06 14:05:32 compute-0 sshd-session[218250]: Accepted publickey for nova from 192.168.122.101 port 48242 ssh2: ECDSA SHA256:XfgxSMU36lLkRyylGtLNNBmata20M8em0i/7ZVz7Bx4
Oct 06 14:05:32 compute-0 systemd-logind[789]: New session 31 of user nova.
Oct 06 14:05:32 compute-0 systemd[1]: Started Session 31 of User nova.
Oct 06 14:05:32 compute-0 sshd-session[218250]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:32 compute-0 nova_compute[192903]: 2025-10-06 14:05:32.750 2 DEBUG nova.network.neutron [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Updating instance_info_cache with network_info: [{"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:32 compute-0 sshd-session[218253]: Received disconnect from 192.168.122.101 port 48242:11: disconnected by user
Oct 06 14:05:32 compute-0 sshd-session[218253]: Disconnected from user nova 192.168.122.101 port 48242
Oct 06 14:05:32 compute-0 sshd-session[218250]: pam_unix(sshd:session): session closed for user nova
Oct 06 14:05:32 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Oct 06 14:05:32 compute-0 systemd-logind[789]: Session 31 logged out. Waiting for processes to exit.
Oct 06 14:05:32 compute-0 systemd-logind[789]: Removed session 31.
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.268 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.279 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwsh5qanu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9453c63e-8e53-4d5f-9571-c0dfe2365ef9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.279 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Creating instance directory: /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.280 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Creating disk.info with the contents: {'/var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk': 'qcow2', '/var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.280 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.281 2 DEBUG nova.objects.instance [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9453c63e-8e53-4d5f-9571-c0dfe2365ef9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.786 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.791 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.794 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.867 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.868 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.869 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.869 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.873 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.874 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.924 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.925 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.965 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.966 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:33 compute-0 nova_compute[192903]: 2025-10-06 14:05:33.966 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.036 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.037 2 DEBUG nova.virt.disk.api [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.037 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.088 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.089 2 DEBUG nova.virt.disk.api [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.089 2 DEBUG nova.objects.instance [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 9453c63e-8e53-4d5f-9571-c0dfe2365ef9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.599 2 DEBUG nova.objects.base [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<9453c63e-8e53-4d5f-9571-c0dfe2365ef9> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.600 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.641 2 DEBUG oslo_concurrency.processutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk.config 497664" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.642 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.645 2 DEBUG nova.virt.libvirt.vif [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1247272142',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1247272142',id=6,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:03:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-dvjs40vx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:03:59Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=9453c63e-8e53-4d5f-9571-c0dfe2365ef9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.645 2 DEBUG nova.network.os_vif_util [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.647 2 DEBUG nova.network.os_vif_util [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.648 2 DEBUG os_vif [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '37919e32-96db-5e67-8811-7936b54bd797', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.661 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d38f548-fe, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0d38f548-fe, col_values=(('qos', UUID('1395c2c7-da8f-414f-8bec-13d21e18cc6c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0d38f548-fe, col_values=(('external_ids', {'iface-id': '0d38f548-fe64-428b-beab-0b96200911a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:00:49', 'vm-uuid': '9453c63e-8e53-4d5f-9571-c0dfe2365ef9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 NetworkManager[52035]: <info>  [1759759534.6658] manager: (tap0d38f548-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.675 2 INFO os_vif [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe')
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.676 2 DEBUG nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.677 2 DEBUG nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwsh5qanu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9453c63e-8e53-4d5f-9571-c0dfe2365ef9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.678 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:34 compute-0 nova_compute[192903]: 2025-10-06 14:05:34.787 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.207 2 DEBUG nova.compute.manager [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.207 2 DEBUG oslo_concurrency.lockutils [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.208 2 DEBUG oslo_concurrency.lockutils [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.208 2 DEBUG oslo_concurrency.lockutils [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.208 2 DEBUG nova.compute.manager [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.209 2 WARNING nova.compute.manager [req-9735243f-d948-4c3b-aee6-af8bfcfe54bd req-5c8dc9e1-0389-4b6b-a21f-2a9d4699fc81 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received unexpected event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with vm_state active and task_state resize_migrating.
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.448 2 DEBUG nova.network.neutron [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Port 0d38f548-fe64-428b-beab-0b96200911a7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:05:35 compute-0 nova_compute[192903]: 2025-10-06 14:05:35.468 2 DEBUG nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwsh5qanu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9453c63e-8e53-4d5f-9571-c0dfe2365ef9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:05:36 compute-0 sshd-session[218275]: Accepted publickey for nova from 192.168.122.101 port 48252 ssh2: ECDSA SHA256:XfgxSMU36lLkRyylGtLNNBmata20M8em0i/7ZVz7Bx4
Oct 06 14:05:36 compute-0 systemd-logind[789]: New session 32 of user nova.
Oct 06 14:05:36 compute-0 systemd[1]: Started Session 32 of User nova.
Oct 06 14:05:36 compute-0 sshd-session[218275]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:36 compute-0 sshd-session[218278]: Received disconnect from 192.168.122.101 port 48252:11: disconnected by user
Oct 06 14:05:36 compute-0 sshd-session[218278]: Disconnected from user nova 192.168.122.101 port 48252
Oct 06 14:05:36 compute-0 sshd-session[218275]: pam_unix(sshd:session): session closed for user nova
Oct 06 14:05:36 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Oct 06 14:05:36 compute-0 systemd-logind[789]: Session 32 logged out. Waiting for processes to exit.
Oct 06 14:05:36 compute-0 systemd-logind[789]: Removed session 32.
Oct 06 14:05:36 compute-0 sshd-session[218280]: Accepted publickey for nova from 192.168.122.101 port 48266 ssh2: ECDSA SHA256:XfgxSMU36lLkRyylGtLNNBmata20M8em0i/7ZVz7Bx4
Oct 06 14:05:36 compute-0 systemd-logind[789]: New session 33 of user nova.
Oct 06 14:05:36 compute-0 systemd[1]: Started Session 33 of User nova.
Oct 06 14:05:36 compute-0 sshd-session[218280]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:36 compute-0 sshd-session[218283]: Received disconnect from 192.168.122.101 port 48266:11: disconnected by user
Oct 06 14:05:36 compute-0 sshd-session[218283]: Disconnected from user nova 192.168.122.101 port 48266
Oct 06 14:05:36 compute-0 sshd-session[218280]: pam_unix(sshd:session): session closed for user nova
Oct 06 14:05:36 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Oct 06 14:05:36 compute-0 systemd-logind[789]: Session 33 logged out. Waiting for processes to exit.
Oct 06 14:05:36 compute-0 systemd-logind[789]: Removed session 33.
Oct 06 14:05:36 compute-0 sshd-session[218285]: Accepted publickey for nova from 192.168.122.101 port 48274 ssh2: ECDSA SHA256:XfgxSMU36lLkRyylGtLNNBmata20M8em0i/7ZVz7Bx4
Oct 06 14:05:36 compute-0 systemd-logind[789]: New session 34 of user nova.
Oct 06 14:05:37 compute-0 systemd[1]: Started Session 34 of User nova.
Oct 06 14:05:37 compute-0 sshd-session[218285]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 06 14:05:37 compute-0 sshd-session[218288]: Received disconnect from 192.168.122.101 port 48274:11: disconnected by user
Oct 06 14:05:37 compute-0 sshd-session[218288]: Disconnected from user nova 192.168.122.101 port 48274
Oct 06 14:05:37 compute-0 sshd-session[218285]: pam_unix(sshd:session): session closed for user nova
Oct 06 14:05:37 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Oct 06 14:05:37 compute-0 systemd-logind[789]: Session 34 logged out. Waiting for processes to exit.
Oct 06 14:05:37 compute-0 systemd-logind[789]: Removed session 34.
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.275 2 DEBUG nova.compute.manager [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.275 2 DEBUG oslo_concurrency.lockutils [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.276 2 DEBUG oslo_concurrency.lockutils [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.276 2 DEBUG oslo_concurrency.lockutils [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.277 2 DEBUG nova.compute.manager [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:05:37 compute-0 nova_compute[192903]: 2025-10-06 14:05:37.277 2 WARNING nova.compute.manager [req-4e26ea0a-0641-496a-b55a-6dbfcaf0b6c2 req-f2810a27-e5fb-4a8e-a028-fc7b3c17bf28 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received unexpected event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with vm_state active and task_state resize_migrating.
Oct 06 14:05:38 compute-0 nova_compute[192903]: 2025-10-06 14:05:38.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:38 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:05:38 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:05:39 compute-0 kernel: tap0d38f548-fe: entered promiscuous mode
Oct 06 14:05:39 compute-0 ovn_controller[95205]: 2025-10-06T14:05:39Z|00062|binding|INFO|Claiming lport 0d38f548-fe64-428b-beab-0b96200911a7 for this additional chassis.
Oct 06 14:05:39 compute-0 ovn_controller[95205]: 2025-10-06T14:05:39Z|00063|binding|INFO|0d38f548-fe64-428b-beab-0b96200911a7: Claiming fa:16:3e:ad:00:49 10.100.0.5
Oct 06 14:05:39 compute-0 NetworkManager[52035]: <info>  [1759759539.0245] manager: (tap0d38f548-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.036 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:00:49 10.100.0.5'], port_security=['fa:16:3e:ad:00:49 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9453c63e-8e53-4d5f-9571-c0dfe2365ef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=0d38f548-fe64-428b-beab-0b96200911a7) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.037 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 0d38f548-fe64-428b-beab-0b96200911a7 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.038 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:05:39 compute-0 ovn_controller[95205]: 2025-10-06T14:05:39Z|00064|binding|INFO|Setting lport 0d38f548-fe64-428b-beab-0b96200911a7 ovn-installed in OVS
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.055 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0f07d5e5-ebe9-4a87-ad1d-17292e08bc1a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 systemd-udevd[218320]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:05:39 compute-0 systemd-machined[152985]: New machine qemu-5-instance-00000006.
Oct 06 14:05:39 compute-0 NetworkManager[52035]: <info>  [1759759539.0856] device (tap0d38f548-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:05:39 compute-0 NetworkManager[52035]: <info>  [1759759539.0866] device (tap0d38f548-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:05:39 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.095 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[e05cb9c4-8a0d-4b70-ae71-9fa4c165cc4e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.098 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e00007-a4f6-46b7-9b3a-65a391a4678a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.131 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[40cbe211-0f37-4a41-bd56-5e1ef7bf964f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.145 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d71858-f018-42b6-87cb-2543dfecb539]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218332, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.160 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[698a3ffa-3f83-4da1-8e3f-2d16418de505]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218336, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218336, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.161 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.164 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.164 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.165 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.165 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:39.166 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[aabc18ec-1b76-48fa-b6f3-5793e2444f1f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.228 2 WARNING neutronclient.v2_0.client [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:39 compute-0 nova_compute[192903]: 2025-10-06 14:05:39.733 2 INFO nova.network.neutron [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating port 61d9a52c-5658-4fb3-b375-73d56009fecb with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.375 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.375 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.376 2 DEBUG nova.network.neutron [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.429 2 DEBUG nova.compute.manager [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-changed-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.430 2 DEBUG nova.compute.manager [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Refreshing instance network info cache due to event network-changed-61d9a52c-5658-4fb3-b375-73d56009fecb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.430 2 DEBUG oslo_concurrency.lockutils [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:41 compute-0 nova_compute[192903]: 2025-10-06 14:05:41.881 2 WARNING neutronclient.v2_0.client [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:43 compute-0 nova_compute[192903]: 2025-10-06 14:05:43.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:44 compute-0 ovn_controller[95205]: 2025-10-06T14:05:44Z|00065|binding|INFO|Claiming lport 0d38f548-fe64-428b-beab-0b96200911a7 for this chassis.
Oct 06 14:05:44 compute-0 ovn_controller[95205]: 2025-10-06T14:05:44Z|00066|binding|INFO|0d38f548-fe64-428b-beab-0b96200911a7: Claiming fa:16:3e:ad:00:49 10.100.0.5
Oct 06 14:05:44 compute-0 ovn_controller[95205]: 2025-10-06T14:05:44Z|00067|binding|INFO|Setting lport 0d38f548-fe64-428b-beab-0b96200911a7 up in Southbound
Oct 06 14:05:44 compute-0 nova_compute[192903]: 2025-10-06 14:05:44.406 2 WARNING neutronclient.v2_0.client [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:44 compute-0 nova_compute[192903]: 2025-10-06 14:05:44.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:44 compute-0 nova_compute[192903]: 2025-10-06 14:05:44.693 2 DEBUG nova.network.neutron [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating instance_info_cache with network_info: [{"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.199 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.202 2 DEBUG oslo_concurrency.lockutils [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.202 2 DEBUG nova.network.neutron [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Refreshing network info cache for port 61d9a52c-5658-4fb3-b375-73d56009fecb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:05:45 compute-0 podman[218359]: 2025-10-06 14:05:45.207178176 +0000 UTC m=+0.065690832 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.477 2 INFO nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Post operation of migration started
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.478 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.709 2 WARNING neutronclient.v2_0.client [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.748 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.749 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.750 2 INFO nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Creating image(s)
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.750 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.758 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.758 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.934 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.935 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:05:45 compute-0 nova_compute[192903]: 2025-10-06 14:05:45.935 2 DEBUG nova.network.neutron [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.257 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.342 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.343 2 DEBUG nova.virt.disk.api [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.344 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.408 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.409 2 DEBUG nova.virt.disk.api [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.415 2 WARNING neutronclient.v2_0.client [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.439 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.579 2 DEBUG nova.network.neutron [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updated VIF entry in instance network info cache for port 61d9a52c-5658-4fb3-b375-73d56009fecb. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.579 2 DEBUG nova.network.neutron [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating instance_info_cache with network_info: [{"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.785 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.920 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.921 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Ensure instance console log exists: /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.921 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.922 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.922 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.925 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Start _get_guest_xml network_info=[{"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.930 2 WARNING nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.933 2 DEBUG nova.virt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-558711417', uuid='e16d2f31-6d64-4d53-8f79-78ea4befde4a'), owner=OwnerMeta(userid='4beaed30a2ec47bb9b5f6adb81ede0f7', username='tempest-TestExecuteActionsViaActuator-1260248176-project-admin', projectid='20952eb66a9c4fd2905273fb8f800689', projectname='tempest-TestExecuteActionsViaActuator-1260248176'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759546.9327967) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.935 2 DEBUG nova.network.neutron [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Updating instance_info_cache with network_info: [{"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.941 2 DEBUG nova.virt.libvirt.host [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.942 2 DEBUG nova.virt.libvirt.host [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.951 2 DEBUG nova.virt.libvirt.host [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.951 2 DEBUG nova.virt.libvirt.host [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.952 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.953 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.954 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.954 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.955 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.955 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.955 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.956 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.956 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.957 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.957 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.957 2 DEBUG nova.virt.hardware [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:05:46 compute-0 nova_compute[192903]: 2025-10-06 14:05:46.958 2 DEBUG nova.objects.instance [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'vcpu_model' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.086 2 DEBUG oslo_concurrency.lockutils [req-89aab0cc-34f2-4f36-9598-ea772c78c22d req-818a5ada-2d5f-4fae-a50d-b47a3f7ef06b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-e16d2f31-6d64-4d53-8f79-78ea4befde4a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:47 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 06 14:05:47 compute-0 systemd[218233]: Activating special unit Exit the Session...
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped target Main User Target.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped target Basic System.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped target Paths.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped target Sockets.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped target Timers.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 06 14:05:47 compute-0 systemd[218233]: Closed D-Bus User Message Bus Socket.
Oct 06 14:05:47 compute-0 systemd[218233]: Stopped Create User's Volatile Files and Directories.
Oct 06 14:05:47 compute-0 systemd[218233]: Removed slice User Application Slice.
Oct 06 14:05:47 compute-0 systemd[218233]: Reached target Shutdown.
Oct 06 14:05:47 compute-0 systemd[218233]: Finished Exit the Session.
Oct 06 14:05:47 compute-0 systemd[218233]: Reached target Exit the Session.
Oct 06 14:05:47 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 06 14:05:47 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 06 14:05:47 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 06 14:05:47 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 06 14:05:47 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 06 14:05:47 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 06 14:05:47 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.446 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-9453c63e-8e53-4d5f-9571-c0dfe2365ef9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.470 2 DEBUG nova.objects.base [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<e16d2f31-6d64-4d53-8f79-78ea4befde4a> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.472 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.523 2 DEBUG oslo_concurrency.processutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.config --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.524 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.525 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.526 2 DEBUG oslo_concurrency.lockutils [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.528 2 DEBUG nova.virt.libvirt.vif [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-558711417',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-558711417',id=8,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:04:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-rcf1loyo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:05:37Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=e16d2f31-6d64-4d53-8f79-78ea4befde4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.529 2 DEBUG nova.network.os_vif_util [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.530 2 DEBUG nova.network.os_vif_util [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.534 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <uuid>e16d2f31-6d64-4d53-8f79-78ea4befde4a</uuid>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <name>instance-00000008</name>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-558711417</nova:name>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:05:46</nova:creationTime>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_input_bus">usb</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_machine_type">q35</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_video_model">virtio</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:property name="hw_vif_model">virtio</nova:property>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:user uuid="4beaed30a2ec47bb9b5f6adb81ede0f7">tempest-TestExecuteActionsViaActuator-1260248176-project-admin</nova:user>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:project uuid="20952eb66a9c4fd2905273fb8f800689">tempest-TestExecuteActionsViaActuator-1260248176</nova:project>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         <nova:port uuid="61d9a52c-5658-4fb3-b375-73d56009fecb">
Oct 06 14:05:47 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <system>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="serial">e16d2f31-6d64-4d53-8f79-78ea4befde4a</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="uuid">e16d2f31-6d64-4d53-8f79-78ea4befde4a</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </system>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <os>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </os>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <features>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </features>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk.config"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:53:9d:42"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <target dev="tap61d9a52c-56"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/console.log" append="off"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <video>
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </video>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:05:47 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:05:47 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:05:47 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:05:47 compute-0 nova_compute[192903]: </domain>
Oct 06 14:05:47 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.535 2 DEBUG nova.virt.libvirt.vif [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-558711417',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-558711417',id=8,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:04:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-rcf1loyo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:05:37Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=e16d2f31-6d64-4d53-8f79-78ea4befde4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.536 2 DEBUG nova.network.os_vif_util [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "vif_mac": "fa:16:3e:53:9d:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.536 2 DEBUG nova.network.os_vif_util [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.537 2 DEBUG os_vif [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.540 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.541 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'bf3cac61-6fcb-5a2d-8acc-39a0b7ee764a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61d9a52c-56, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap61d9a52c-56, col_values=(('qos', UUID('bc7e1285-f414-4e56-ad35-0ae544238f3a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap61d9a52c-56, col_values=(('external_ids', {'iface-id': '61d9a52c-5658-4fb3-b375-73d56009fecb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:9d:42', 'vm-uuid': 'e16d2f31-6d64-4d53-8f79-78ea4befde4a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 NetworkManager[52035]: <info>  [1759759547.5509] manager: (tap61d9a52c-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.557 2 INFO os_vif [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56')
Oct 06 14:05:47 compute-0 podman[218398]: 2025-10-06 14:05:47.669115696 +0000 UTC m=+0.066417312 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 06 14:05:47 compute-0 podman[218397]: 2025-10-06 14:05:47.67406048 +0000 UTC m=+0.072954149 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:05:47 compute-0 podman[218396]: 2025-10-06 14:05:47.688189623 +0000 UTC m=+0.094442122 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.985 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.985 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.985 2 DEBUG oslo_concurrency.lockutils [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:47 compute-0 nova_compute[192903]: 2025-10-06 14:05:47.992 2 INFO nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:05:47 compute-0 virtqemud[192802]: Domain id=5 name='instance-00000006' uuid=9453c63e-8e53-4d5f-9571-c0dfe2365ef9 is tainted: custom-monitor
Oct 06 14:05:48 compute-0 nova_compute[192903]: 2025-10-06 14:05:48.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.001 2 INFO nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.143 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.143 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.143 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No VIF found with MAC fa:16:3e:53:9d:42, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.144 2 INFO nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Using config drive
Oct 06 14:05:49 compute-0 kernel: tap61d9a52c-56: entered promiscuous mode
Oct 06 14:05:49 compute-0 NetworkManager[52035]: <info>  [1759759549.2151] manager: (tap61d9a52c-56): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Oct 06 14:05:49 compute-0 ovn_controller[95205]: 2025-10-06T14:05:49Z|00068|binding|INFO|Claiming lport 61d9a52c-5658-4fb3-b375-73d56009fecb for this chassis.
Oct 06 14:05:49 compute-0 ovn_controller[95205]: 2025-10-06T14:05:49Z|00069|binding|INFO|61d9a52c-5658-4fb3-b375-73d56009fecb: Claiming fa:16:3e:53:9d:42 10.100.0.14
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.226 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:42 10.100.0.14'], port_security=['fa:16:3e:53:9d:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e16d2f31-6d64-4d53-8f79-78ea4befde4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=61d9a52c-5658-4fb3-b375-73d56009fecb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.227 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 61d9a52c-5658-4fb3-b375-73d56009fecb in datapath 69d92bff-38df-455c-b731-a2864652e2a5 bound to our chassis
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.229 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:05:49 compute-0 ovn_controller[95205]: 2025-10-06T14:05:49Z|00070|binding|INFO|Setting lport 61d9a52c-5658-4fb3-b375-73d56009fecb ovn-installed in OVS
Oct 06 14:05:49 compute-0 ovn_controller[95205]: 2025-10-06T14:05:49Z|00071|binding|INFO|Setting lport 61d9a52c-5658-4fb3-b375-73d56009fecb up in Southbound
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.248 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4febfe-dc73-4043-b8b7-37bf9b855680]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 systemd-machined[152985]: New machine qemu-6-instance-00000008.
Oct 06 14:05:49 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.277 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d24cb6-63a5-41ab-99d5-9c25531b85a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.280 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec9ed3c-61e5-4cc2-9d6a-f1ac557c1b3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 systemd-udevd[218477]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:05:49 compute-0 NetworkManager[52035]: <info>  [1759759549.2973] device (tap61d9a52c-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:05:49 compute-0 NetworkManager[52035]: <info>  [1759759549.2986] device (tap61d9a52c-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.317 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7d32dc76-c52e-401e-980b-482f7fe347ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.335 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[22460077-e232-4a48-a0cd-18bdc336706d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 11, 'rx_bytes': 1630, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 11, 'rx_bytes': 1630, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218488, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.349 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dadd140b-a4f3-4bde-8ebd-23620939a338]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218489, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218489, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.350 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.354 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.354 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.354 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.354 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.355 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a556c6bc-994c-4ef4-9412-3d62e9c70350]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.743 2 DEBUG nova.compute.manager [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.744 2 DEBUG oslo_concurrency.lockutils [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.744 2 DEBUG oslo_concurrency.lockutils [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.744 2 DEBUG oslo_concurrency.lockutils [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.744 2 DEBUG nova.compute.manager [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.744 2 WARNING nova.compute.manager [req-ce756fbf-7a78-4ce0-a6d7-1ca400d78887 req-3f8f351a-19c0-4e88-beb1-5f3d1ce618bd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received unexpected event network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with vm_state active and task_state resize_finish.
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.815 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:05:49 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:49.816 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:05:49 compute-0 nova_compute[192903]: 2025-10-06 14:05:49.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.008 2 INFO nova.virt.libvirt.driver [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.013 2 DEBUG nova.compute.manager [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.092 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.093 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.171 2 DEBUG nova.compute.manager [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.175 2 INFO nova.virt.libvirt.driver [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Instance running successfully.
Oct 06 14:05:50 compute-0 virtqemud[192802]: argument unsupported: QEMU guest agent is not configured
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.178 2 DEBUG nova.virt.libvirt.guest [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.178 2 DEBUG nova.virt.libvirt.driver [None req-4d63403b-0550-4166-b207-d55f1b2f1997 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Oct 06 14:05:50 compute-0 nova_compute[192903]: 2025-10-06 14:05:50.524 2 DEBUG nova.objects.instance [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.152 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.232 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.232 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.301 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.308 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.384 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.385 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.462 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.472 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.546 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.559 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.560 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.649 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.657 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.727 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.728 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.786 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.787 2 WARNING neutronclient.v2_0.client [None req-ed32471c-e77d-451d-9ebc-6f308acc3171 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.802 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.806 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.856 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.856 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.920 2 DEBUG nova.compute.manager [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.921 2 DEBUG oslo_concurrency.lockutils [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.922 2 DEBUG oslo_concurrency.lockutils [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.922 2 DEBUG oslo_concurrency.lockutils [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.923 2 DEBUG nova.compute.manager [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.923 2 WARNING nova.compute.manager [req-2d811920-b6d6-4829-a654-8a9dd2567034 req-2848e647-7af8-4122-9ae0-9560b1769f6c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received unexpected event network-vif-plugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with vm_state resized and task_state None.
Oct 06 14:05:51 compute-0 nova_compute[192903]: 2025-10-06 14:05:51.941 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.170 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.171 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.212 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.212 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5171MB free_disk=73.16145324707031GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.213 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.213 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:05:52 compute-0 nova_compute[192903]: 2025-10-06 14:05:52.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.243 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance 9453c63e-8e53-4d5f-9571-c0dfe2365ef9 as it has an incoming, in-progress migration 944c80fa-28e5-4548-b930-d5ba7c447a44. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.243 2 DEBUG nova.objects.instance [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.244 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance e16d2f31-6d64-4d53-8f79-78ea4befde4a as it has an incoming, in-progress migration dee5880d-3d5b-4f77-ae9c-7a0bec90ba44. Migration status is finished _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.753 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.753 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating resource usage from migration dee5880d-3d5b-4f77-ae9c-7a0bec90ba44
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance ea9b1b2c-e123-4a8b-a2ef-f29e14732d20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance a8449b2e-50c6-45a4-b201-210240c50968 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 9453c63e-8e53-4d5f-9571-c0dfe2365ef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance e16d2f31-6d64-4d53-8f79-78ea4befde4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.794 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:05:53 compute-0 nova_compute[192903]: 2025-10-06 14:05:53.795 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:05:52 up  1:06,  0 user,  load average: 0.87, 0.47, 0.43\n', 'num_instances': '5', 'num_vm_active': '4', 'num_task_None': '5', 'num_os_type_None': '5', 'num_proj_20952eb66a9c4fd2905273fb8f800689': '5', 'io_workload': '0', 'num_vm_resized': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:05:54 compute-0 nova_compute[192903]: 2025-10-06 14:05:54.091 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:05:54 compute-0 nova_compute[192903]: 2025-10-06 14:05:54.597 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:05:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:05:54.823 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:05:55 compute-0 nova_compute[192903]: 2025-10-06 14:05:55.106 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:05:55 compute-0 nova_compute[192903]: 2025-10-06 14:05:55.107 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.894s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:05:57 compute-0 nova_compute[192903]: 2025-10-06 14:05:57.108 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:57 compute-0 nova_compute[192903]: 2025-10-06 14:05:57.108 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:57 compute-0 nova_compute[192903]: 2025-10-06 14:05:57.109 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:05:57 compute-0 podman[218531]: 2025-10-06 14:05:57.212425016 +0000 UTC m=+0.073098273 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:05:57 compute-0 nova_compute[192903]: 2025-10-06 14:05:57.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:58 compute-0 nova_compute[192903]: 2025-10-06 14:05:58.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:05:59 compute-0 podman[203308]: time="2025-10-06T14:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:05:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:05:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3466 "" "Go-http-client/1.1"
Oct 06 14:06:00 compute-0 podman[218553]: 2025-10-06 14:06:00.240013467 +0000 UTC m=+0.100553578 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: ERROR   14:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: ERROR   14:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: ERROR   14:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: ERROR   14:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: ERROR   14:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:06:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:06:01 compute-0 ovn_controller[95205]: 2025-10-06T14:06:01Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:9d:42 10.100.0.14
Oct 06 14:06:02 compute-0 nova_compute[192903]: 2025-10-06 14:06:02.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:03 compute-0 nova_compute[192903]: 2025-10-06 14:06:03.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.763 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.764 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.764 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.764 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.765 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:06 compute-0 nova_compute[192903]: 2025-10-06 14:06:06.775 2 INFO nova.compute.manager [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Terminating instance
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.293 2 DEBUG nova.compute.manager [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:06:07 compute-0 kernel: tap1a60ab0b-06 (unregistering): left promiscuous mode
Oct 06 14:06:07 compute-0 NetworkManager[52035]: <info>  [1759759567.3299] device (tap1a60ab0b-06): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 ovn_controller[95205]: 2025-10-06T14:06:07Z|00072|binding|INFO|Releasing lport 1a60ab0b-06f0-436a-a116-c1d328ad3203 from this chassis (sb_readonly=0)
Oct 06 14:06:07 compute-0 ovn_controller[95205]: 2025-10-06T14:06:07Z|00073|binding|INFO|Setting lport 1a60ab0b-06f0-436a-a116-c1d328ad3203 down in Southbound
Oct 06 14:06:07 compute-0 ovn_controller[95205]: 2025-10-06T14:06:07Z|00074|binding|INFO|Removing iface tap1a60ab0b-06 ovn-installed in OVS
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.351 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:4d:4d 10.100.0.12'], port_security=['fa:16:3e:f1:4d:4d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a8449b2e-50c6-45a4-b201-210240c50968', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=1a60ab0b-06f0-436a-a116-c1d328ad3203) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.353 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 1a60ab0b-06f0-436a-a116-c1d328ad3203 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.354 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 06 14:06:07 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 14.162s CPU time.
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.379 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[468e36ad-f75b-4c54-86f6-670985258808]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 systemd-machined[152985]: Machine qemu-4-instance-00000009 terminated.
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.415 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[6e493ed3-84a4-4bda-a962-f50fd73a0d8e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.420 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[30707b7a-6710-4ffd-96d6-4eafcf1ea23d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.455 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[fe322305-d96f-449a-a553-2eecd8ebcef3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.478 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[186dbb80-6ef4-4ba5-9d3d-d29614ff0079]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 13, 'rx_bytes': 1966, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 13, 'rx_bytes': 1966, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218600, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.500 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8baf4051-471f-4fd0-93aa-91a88cb7e0bc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218601, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218601, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.502 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.512 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.512 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.513 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.513 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:07.514 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ed00cfeb-cb68-4230-a3ec-cb70508f6fb2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.570 2 INFO nova.virt.libvirt.driver [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Instance destroyed successfully.
Oct 06 14:06:07 compute-0 nova_compute[192903]: 2025-10-06 14:06:07.571 2 DEBUG nova.objects.instance [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'resources' on Instance uuid a8449b2e-50c6-45a4-b201-210240c50968 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.083 2 DEBUG nova.virt.libvirt.vif [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1232667021',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1232667021',id=9,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:05:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-ofptvqyy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:05:09Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=a8449b2e-50c6-45a4-b201-210240c50968,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.083 2 DEBUG nova.network.os_vif_util [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "address": "fa:16:3e:f1:4d:4d", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a60ab0b-06", "ovs_interfaceid": "1a60ab0b-06f0-436a-a116-c1d328ad3203", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.083 2 DEBUG nova.network.os_vif_util [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.084 2 DEBUG os_vif [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a60ab0b-06, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.090 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e6c1766c-d5bd-4d76-8f0c-f4dab51cb4d0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.094 2 INFO os_vif [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:4d:4d,bridge_name='br-int',has_traffic_filtering=True,id=1a60ab0b-06f0-436a-a116-c1d328ad3203,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a60ab0b-06')
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.094 2 INFO nova.virt.libvirt.driver [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Deleting instance files /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968_del
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.095 2 INFO nova.virt.libvirt.driver [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Deletion of /var/lib/nova/instances/a8449b2e-50c6-45a4-b201-210240c50968_del complete
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.613 2 INFO nova.compute.manager [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.614 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.614 2 DEBUG nova.compute.manager [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.614 2 DEBUG nova.network.neutron [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:06:08 compute-0 nova_compute[192903]: 2025-10-06 14:06:08.615 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.730 2 DEBUG nova.compute.manager [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.731 2 DEBUG oslo_concurrency.lockutils [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.731 2 DEBUG oslo_concurrency.lockutils [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.731 2 DEBUG oslo_concurrency.lockutils [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.732 2 DEBUG nova.compute.manager [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] No waiting events found dispatching network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:09 compute-0 nova_compute[192903]: 2025-10-06 14:06:09.732 2 DEBUG nova.compute.manager [req-858daed1-9c8e-4d54-b50e-6c177e37aecc req-fab737f9-650e-45e2-9286-4a1ff38ded4d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:10 compute-0 nova_compute[192903]: 2025-10-06 14:06:10.123 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.107 2 DEBUG nova.network.neutron [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:11.358 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:11.358 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:11.360 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.615 2 INFO nova.compute.manager [-] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Took 3.00 seconds to deallocate network for instance.
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.868 2 DEBUG nova.compute.manager [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.869 2 DEBUG oslo_concurrency.lockutils [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a8449b2e-50c6-45a4-b201-210240c50968-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.869 2 DEBUG oslo_concurrency.lockutils [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.870 2 DEBUG oslo_concurrency.lockutils [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.870 2 DEBUG nova.compute.manager [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] No waiting events found dispatching network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.870 2 WARNING nova.compute.manager [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received unexpected event network-vif-unplugged-1a60ab0b-06f0-436a-a116-c1d328ad3203 for instance with vm_state deleted and task_state None.
Oct 06 14:06:11 compute-0 nova_compute[192903]: 2025-10-06 14:06:11.871 2 DEBUG nova.compute.manager [req-4b29a0c9-563c-4b4a-a1d1-05fb8cc5ea0c req-22a59dbb-ec58-46d9-9d16-4a87595bd14d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a8449b2e-50c6-45a4-b201-210240c50968] Received event network-vif-deleted-1a60ab0b-06f0-436a-a116-c1d328ad3203 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:12 compute-0 nova_compute[192903]: 2025-10-06 14:06:12.140 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:12 compute-0 nova_compute[192903]: 2025-10-06 14:06:12.140 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:12 compute-0 nova_compute[192903]: 2025-10-06 14:06:12.279 2 DEBUG nova.compute.provider_tree [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:12 compute-0 nova_compute[192903]: 2025-10-06 14:06:12.787 2 DEBUG nova.scheduler.client.report [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:13 compute-0 nova_compute[192903]: 2025-10-06 14:06:13.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:13 compute-0 nova_compute[192903]: 2025-10-06 14:06:13.297 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:13 compute-0 nova_compute[192903]: 2025-10-06 14:06:13.326 2 INFO nova.scheduler.client.report [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Deleted allocations for instance a8449b2e-50c6-45a4-b201-210240c50968
Oct 06 14:06:13 compute-0 nova_compute[192903]: 2025-10-06 14:06:13.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:14 compute-0 nova_compute[192903]: 2025-10-06 14:06:14.368 2 DEBUG oslo_concurrency.lockutils [None req-e441044e-aea8-4261-b3ec-80618e09d8d5 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "a8449b2e-50c6-45a4-b201-210240c50968" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.604s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.085 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.085 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.086 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.086 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.087 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.101 2 INFO nova.compute.manager [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Terminating instance
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.618 2 DEBUG nova.compute.manager [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:06:15 compute-0 kernel: tap61d9a52c-56 (unregistering): left promiscuous mode
Oct 06 14:06:15 compute-0 NetworkManager[52035]: <info>  [1759759575.6484] device (tap61d9a52c-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:06:15 compute-0 ovn_controller[95205]: 2025-10-06T14:06:15Z|00075|binding|INFO|Releasing lport 61d9a52c-5658-4fb3-b375-73d56009fecb from this chassis (sb_readonly=0)
Oct 06 14:06:15 compute-0 ovn_controller[95205]: 2025-10-06T14:06:15Z|00076|binding|INFO|Setting lport 61d9a52c-5658-4fb3-b375-73d56009fecb down in Southbound
Oct 06 14:06:15 compute-0 ovn_controller[95205]: 2025-10-06T14:06:15Z|00077|binding|INFO|Removing iface tap61d9a52c-56 ovn-installed in OVS
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.669 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:9d:42 10.100.0.14'], port_security=['fa:16:3e:53:9d:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e16d2f31-6d64-4d53-8f79-78ea4befde4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=61d9a52c-5658-4fb3-b375-73d56009fecb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.670 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 61d9a52c-5658-4fb3-b375-73d56009fecb in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.672 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.688 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9297ff8b-9608-4bc3-9a2b-704564d1b253]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct 06 14:06:15 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 12.679s CPU time.
Oct 06 14:06:15 compute-0 systemd-machined[152985]: Machine qemu-6-instance-00000008 terminated.
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.723 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c60141be-63bc-4978-aca0-ce7a1ea0d747]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.725 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[33534c7a-e6a6-449c-b739-9984a813f4ef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 podman[218621]: 2025-10-06 14:06:15.745921718 +0000 UTC m=+0.067187143 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.758 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[41de2690-1ee9-4401-a892-6056c36d3d48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.779 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1fff45-93e4-4569-9c7a-695eb0bf3573]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 2008, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 15, 'rx_bytes': 2008, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218656, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.796 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6e37f0-a676-4a89-96aa-a0024c6ddd77]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218657, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218657, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.797 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.804 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.804 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.805 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.805 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.805 2 DEBUG nova.compute.manager [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.806 2 DEBUG oslo_concurrency.lockutils [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.806 2 DEBUG oslo_concurrency.lockutils [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.807 2 DEBUG oslo_concurrency.lockutils [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:15.806 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[80638c15-9d91-4622-b475-bf88c249d54e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.807 2 DEBUG nova.compute.manager [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.807 2 DEBUG nova.compute.manager [req-605ff1fc-7a26-4b81-b660-d37b93888a08 req-10d18741-512d-4d6c-bd7a-fbfd11c1e61c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.878 2 INFO nova.virt.libvirt.driver [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Instance destroyed successfully.
Oct 06 14:06:15 compute-0 nova_compute[192903]: 2025-10-06 14:06:15.878 2 DEBUG nova.objects.instance [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'resources' on Instance uuid e16d2f31-6d64-4d53-8f79-78ea4befde4a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.385 2 DEBUG nova.virt.libvirt.vif [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-558711417',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-558711417',id=8,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:05:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-rcf1loyo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:06:01Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=e16d2f31-6d64-4d53-8f79-78ea4befde4a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.386 2 DEBUG nova.network.os_vif_util [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "61d9a52c-5658-4fb3-b375-73d56009fecb", "address": "fa:16:3e:53:9d:42", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d9a52c-56", "ovs_interfaceid": "61d9a52c-5658-4fb3-b375-73d56009fecb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.388 2 DEBUG nova.network.os_vif_util [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.388 2 DEBUG os_vif [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d9a52c-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.436 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bc7e1285-f414-4e56-ad35-0ae544238f3a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.443 2 INFO os_vif [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:9d:42,bridge_name='br-int',has_traffic_filtering=True,id=61d9a52c-5658-4fb3-b375-73d56009fecb,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d9a52c-56')
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.444 2 INFO nova.virt.libvirt.driver [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Deleting instance files /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a_del
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.455 2 INFO nova.virt.libvirt.driver [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Deletion of /var/lib/nova/instances/e16d2f31-6d64-4d53-8f79-78ea4befde4a_del complete
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.968 2 INFO nova.compute.manager [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.969 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.970 2 DEBUG nova.compute.manager [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.970 2 DEBUG nova.network.neutron [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:06:16 compute-0 nova_compute[192903]: 2025-10-06 14:06:16.970 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.468 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.877 2 DEBUG nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.878 2 DEBUG oslo_concurrency.lockutils [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.878 2 DEBUG oslo_concurrency.lockutils [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.879 2 DEBUG oslo_concurrency.lockutils [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.879 2 DEBUG nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] No waiting events found dispatching network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.880 2 DEBUG nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-unplugged-61d9a52c-5658-4fb3-b375-73d56009fecb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.880 2 DEBUG nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Received event network-vif-deleted-61d9a52c-5658-4fb3-b375-73d56009fecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.881 2 INFO nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Neutron deleted interface 61d9a52c-5658-4fb3-b375-73d56009fecb; detaching it from the instance and deleting it from the info cache
Oct 06 14:06:17 compute-0 nova_compute[192903]: 2025-10-06 14:06:17.881 2 DEBUG nova.network.neutron [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:18 compute-0 podman[218678]: 2025-10-06 14:06:18.230617517 +0000 UTC m=+0.078520001 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 06 14:06:18 compute-0 podman[218677]: 2025-10-06 14:06:18.247786992 +0000 UTC m=+0.100224359 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:06:18 compute-0 nova_compute[192903]: 2025-10-06 14:06:18.253 2 DEBUG nova.network.neutron [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:18 compute-0 podman[218676]: 2025-10-06 14:06:18.311734447 +0000 UTC m=+0.165955542 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 14:06:18 compute-0 nova_compute[192903]: 2025-10-06 14:06:18.389 2 DEBUG nova.compute.manager [req-e2810cd2-3f17-4ed2-b0fd-b0bfcdcf4fd1 req-3eb301a3-abb2-4ff1-900c-b2d89dff1e21 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Detach interface failed, port_id=61d9a52c-5658-4fb3-b375-73d56009fecb, reason: Instance e16d2f31-6d64-4d53-8f79-78ea4befde4a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:06:18 compute-0 nova_compute[192903]: 2025-10-06 14:06:18.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:18 compute-0 nova_compute[192903]: 2025-10-06 14:06:18.761 2 INFO nova.compute.manager [-] [instance: e16d2f31-6d64-4d53-8f79-78ea4befde4a] Took 1.79 seconds to deallocate network for instance.
Oct 06 14:06:19 compute-0 nova_compute[192903]: 2025-10-06 14:06:19.284 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:19 compute-0 nova_compute[192903]: 2025-10-06 14:06:19.285 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:19 compute-0 nova_compute[192903]: 2025-10-06 14:06:19.399 2 DEBUG nova.compute.provider_tree [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:19 compute-0 nova_compute[192903]: 2025-10-06 14:06:19.908 2 DEBUG nova.scheduler.client.report [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:20 compute-0 nova_compute[192903]: 2025-10-06 14:06:20.421 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:20 compute-0 nova_compute[192903]: 2025-10-06 14:06:20.446 2 INFO nova.scheduler.client.report [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Deleted allocations for instance e16d2f31-6d64-4d53-8f79-78ea4befde4a
Oct 06 14:06:21 compute-0 nova_compute[192903]: 2025-10-06 14:06:21.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:21 compute-0 nova_compute[192903]: 2025-10-06 14:06:21.477 2 DEBUG oslo_concurrency.lockutils [None req-23485e27-4c55-4bb2-81b6-65ec51ea4ac8 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "e16d2f31-6d64-4d53-8f79-78ea4befde4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.392s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:23 compute-0 nova_compute[192903]: 2025-10-06 14:06:23.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.094 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.094 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.095 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.095 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.095 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.107 2 INFO nova.compute.manager [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Terminating instance
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.621 2 DEBUG nova.compute.manager [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:06:24 compute-0 kernel: tapb1f44cfa-c6 (unregistering): left promiscuous mode
Oct 06 14:06:24 compute-0 NetworkManager[52035]: <info>  [1759759584.6535] device (tapb1f44cfa-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:06:24 compute-0 ovn_controller[95205]: 2025-10-06T14:06:24Z|00078|binding|INFO|Releasing lport b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 from this chassis (sb_readonly=0)
Oct 06 14:06:24 compute-0 ovn_controller[95205]: 2025-10-06T14:06:24Z|00079|binding|INFO|Setting lport b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 down in Southbound
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 ovn_controller[95205]: 2025-10-06T14:06:24Z|00080|binding|INFO|Removing iface tapb1f44cfa-c6 ovn-installed in OVS
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.678 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:4a:68 10.100.0.7'], port_security=['fa:16:3e:77:4a:68 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ea9b1b2c-e123-4a8b-a2ef-f29e14732d20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.679 104072 INFO neutron.agent.ovn.metadata.agent [-] Port b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.682 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.702 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[75fd04b8-9556-4959-a604-b069bd27e35b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct 06 14:06:24 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 18.479s CPU time.
Oct 06 14:06:24 compute-0 systemd-machined[152985]: Machine qemu-3-instance-00000007 terminated.
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.736 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f092e7dd-dd60-46aa-893f-609b54b5e4f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.738 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[2a06be15-6647-4f6e-aed6-d17d8309cd7a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.781 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[50c0e7a9-b668-4fe9-a3dd-4a179482118e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.803 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1a087e37-f33e-4bcb-ad8c-eefba2fe121f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 2008, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 2008, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218751, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.831 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[36607ff0-1bd3-4e2a-935c-217c2ebe54de]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218752, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218752, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.832 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.841 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.841 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.842 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.842 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:24.844 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4461b5ea-13be-422f-b8d2-acdb7bacb230]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.863 2 DEBUG nova.compute.manager [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.864 2 DEBUG oslo_concurrency.lockutils [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.864 2 DEBUG oslo_concurrency.lockutils [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.864 2 DEBUG oslo_concurrency.lockutils [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.864 2 DEBUG nova.compute.manager [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] No waiting events found dispatching network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.864 2 DEBUG nova.compute.manager [req-154e0415-77b6-4edd-b844-cd3cf988b5d8 req-2f178fd4-1fd5-4cd2-816c-8a29e6fda721 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.897 2 INFO nova.virt.libvirt.driver [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Instance destroyed successfully.
Oct 06 14:06:24 compute-0 nova_compute[192903]: 2025-10-06 14:06:24.898 2 DEBUG nova.objects.instance [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'resources' on Instance uuid ea9b1b2c-e123-4a8b-a2ef-f29e14732d20 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.403 2 DEBUG nova.virt.libvirt.vif [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:04:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2070851038',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2070851038',id=7,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:04:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-y3zwwbvy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:04:23Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=ea9b1b2c-e123-4a8b-a2ef-f29e14732d20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.403 2 DEBUG nova.network.os_vif_util [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "address": "fa:16:3e:77:4a:68", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f44cfa-c6", "ovs_interfaceid": "b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.404 2 DEBUG nova.network.os_vif_util [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.404 2 DEBUG os_vif [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.405 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f44cfa-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e89a54ce-ac98-4fa8-b2c8-b06e613ac071) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.412 2 INFO os_vif [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:4a:68,bridge_name='br-int',has_traffic_filtering=True,id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f44cfa-c6')
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.412 2 INFO nova.virt.libvirt.driver [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Deleting instance files /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20_del
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.413 2 INFO nova.virt.libvirt.driver [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Deletion of /var/lib/nova/instances/ea9b1b2c-e123-4a8b-a2ef-f29e14732d20_del complete
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.925 2 INFO nova.compute.manager [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Took 1.30 seconds to destroy the instance on the hypervisor.
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.926 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.927 2 DEBUG nova.compute.manager [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.927 2 DEBUG nova.network.neutron [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:06:25 compute-0 nova_compute[192903]: 2025-10-06 14:06:25.927 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.325 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.923 2 DEBUG nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.923 2 DEBUG oslo_concurrency.lockutils [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.924 2 DEBUG oslo_concurrency.lockutils [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.924 2 DEBUG oslo_concurrency.lockutils [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.924 2 DEBUG nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] No waiting events found dispatching network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.925 2 DEBUG nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-unplugged-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.925 2 DEBUG nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Received event network-vif-deleted-b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.925 2 INFO nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Neutron deleted interface b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8; detaching it from the instance and deleting it from the info cache
Oct 06 14:06:26 compute-0 nova_compute[192903]: 2025-10-06 14:06:26.925 2 DEBUG nova.network.neutron [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:27 compute-0 nova_compute[192903]: 2025-10-06 14:06:27.119 2 DEBUG nova.network.neutron [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:27 compute-0 nova_compute[192903]: 2025-10-06 14:06:27.433 2 DEBUG nova.compute.manager [req-b48baba8-4b53-49b9-9c7d-dd140e91f29b req-2c50c85c-0e6f-405f-9c77-04b52b096b70 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Detach interface failed, port_id=b1f44cfa-c641-4bba-8edf-86a3ffc4c8b8, reason: Instance ea9b1b2c-e123-4a8b-a2ef-f29e14732d20 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:06:27 compute-0 nova_compute[192903]: 2025-10-06 14:06:27.625 2 INFO nova.compute.manager [-] [instance: ea9b1b2c-e123-4a8b-a2ef-f29e14732d20] Took 1.70 seconds to deallocate network for instance.
Oct 06 14:06:28 compute-0 nova_compute[192903]: 2025-10-06 14:06:28.146 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:28 compute-0 nova_compute[192903]: 2025-10-06 14:06:28.147 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:28 compute-0 podman[218770]: 2025-10-06 14:06:28.21732104 +0000 UTC m=+0.079573090 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:06:28 compute-0 nova_compute[192903]: 2025-10-06 14:06:28.230 2 DEBUG nova.compute.provider_tree [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:28 compute-0 nova_compute[192903]: 2025-10-06 14:06:28.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:28 compute-0 nova_compute[192903]: 2025-10-06 14:06:28.740 2 DEBUG nova.scheduler.client.report [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:29 compute-0 nova_compute[192903]: 2025-10-06 14:06:29.252 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:29 compute-0 nova_compute[192903]: 2025-10-06 14:06:29.284 2 INFO nova.scheduler.client.report [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Deleted allocations for instance ea9b1b2c-e123-4a8b-a2ef-f29e14732d20
Oct 06 14:06:29 compute-0 podman[203308]: time="2025-10-06T14:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:06:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:06:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 06 14:06:30 compute-0 nova_compute[192903]: 2025-10-06 14:06:30.315 2 DEBUG oslo_concurrency.lockutils [None req-af714203-3138-40a1-aa69-b627277249db 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "ea9b1b2c-e123-4a8b-a2ef-f29e14732d20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:30 compute-0 nova_compute[192903]: 2025-10-06 14:06:30.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.020 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.021 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.021 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.021 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.022 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.039 2 INFO nova.compute.manager [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Terminating instance
Oct 06 14:06:31 compute-0 podman[218791]: 2025-10-06 14:06:31.226101481 +0000 UTC m=+0.088388069 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: ERROR   14:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: ERROR   14:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: ERROR   14:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: ERROR   14:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: ERROR   14:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:06:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.557 2 DEBUG nova.compute.manager [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:06:31 compute-0 kernel: tap0d38f548-fe (unregistering): left promiscuous mode
Oct 06 14:06:31 compute-0 NetworkManager[52035]: <info>  [1759759591.5823] device (tap0d38f548-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:06:31 compute-0 ovn_controller[95205]: 2025-10-06T14:06:31Z|00081|binding|INFO|Releasing lport 0d38f548-fe64-428b-beab-0b96200911a7 from this chassis (sb_readonly=0)
Oct 06 14:06:31 compute-0 ovn_controller[95205]: 2025-10-06T14:06:31Z|00082|binding|INFO|Setting lport 0d38f548-fe64-428b-beab-0b96200911a7 down in Southbound
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 ovn_controller[95205]: 2025-10-06T14:06:31Z|00083|binding|INFO|Removing iface tap0d38f548-fe ovn-installed in OVS
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.604 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:00:49 10.100.0.5'], port_security=['fa:16:3e:ad:00:49 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9453c63e-8e53-4d5f-9571-c0dfe2365ef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=0d38f548-fe64-428b-beab-0b96200911a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.605 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 0d38f548-fe64-428b-beab-0b96200911a7 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.606 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69d92bff-38df-455c-b731-a2864652e2a5
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.631 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b11bf1a4-e336-4c79-8cda-cf55d3f6f1f3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct 06 14:06:31 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 4.693s CPU time.
Oct 06 14:06:31 compute-0 systemd-machined[152985]: Machine qemu-5-instance-00000006 terminated.
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.667 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[ec167628-c323-45c6-9b31-9b76a3939966]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.670 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[778cd652-0d09-4949-9572-9e00c8060b29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.701 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[6060ea9e-a6fc-47b7-8938-b80f733684df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.717 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[84187836-3647-4a90-b0f0-ab41faa741b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69d92bff-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:f3:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384394, 'reachable_time': 31438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218825, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.734 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[003a4351-0083-4e6a-8033-e767e1920dbd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384410, 'tstamp': 384410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218826, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap69d92bff-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384414, 'tstamp': 384414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218826, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.735 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.742 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69d92bff-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.742 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.742 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69d92bff-30, col_values=(('external_ids', {'iface-id': '4cb572c5-2fe1-4cc2-9aac-d044653b4542'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.743 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:06:31 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:31.744 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed4b607-f525-40b7-bdae-4991e327c892]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-69d92bff-38df-455c-b731-a2864652e2a5\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 69d92bff-38df-455c-b731-a2864652e2a5\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.829 2 INFO nova.virt.libvirt.driver [-] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Instance destroyed successfully.
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.830 2 DEBUG nova.objects.instance [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'resources' on Instance uuid 9453c63e-8e53-4d5f-9571-c0dfe2365ef9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.881 2 DEBUG nova.compute.manager [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Received event network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.882 2 DEBUG oslo_concurrency.lockutils [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.882 2 DEBUG oslo_concurrency.lockutils [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.882 2 DEBUG oslo_concurrency.lockutils [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.883 2 DEBUG nova.compute.manager [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] No waiting events found dispatching network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:31 compute-0 nova_compute[192903]: 2025-10-06 14:06:31.883 2 DEBUG nova.compute.manager [req-9ff68387-440b-4271-a2d4-983ed8f0eb1c req-e74fb02a-2511-44a6-9abe-1d5152435cb8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Received event network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.338 2 DEBUG nova.virt.libvirt.vif [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1247272142',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1247272142',id=6,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:03:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-dvjs40vx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:05:51Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=9453c63e-8e53-4d5f-9571-c0dfe2365ef9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.338 2 DEBUG nova.network.os_vif_util [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "0d38f548-fe64-428b-beab-0b96200911a7", "address": "fa:16:3e:ad:00:49", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d38f548-fe", "ovs_interfaceid": "0d38f548-fe64-428b-beab-0b96200911a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.339 2 DEBUG nova.network.os_vif_util [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.339 2 DEBUG os_vif [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d38f548-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1395c2c7-da8f-414f-8bec-13d21e18cc6c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.348 2 INFO os_vif [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:00:49,bridge_name='br-int',has_traffic_filtering=True,id=0d38f548-fe64-428b-beab-0b96200911a7,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d38f548-fe')
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.348 2 INFO nova.virt.libvirt.driver [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Deleting instance files /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9_del
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.349 2 INFO nova.virt.libvirt.driver [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Deletion of /var/lib/nova/instances/9453c63e-8e53-4d5f-9571-c0dfe2365ef9_del complete
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.861 2 INFO nova.compute.manager [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Took 1.30 seconds to destroy the instance on the hypervisor.
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.862 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.862 2 DEBUG nova.compute.manager [-] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.863 2 DEBUG nova.network.neutron [-] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:06:32 compute-0 nova_compute[192903]: 2025-10-06 14:06:32.863 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.820 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.956 2 DEBUG nova.compute.manager [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Received event network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.956 2 DEBUG oslo_concurrency.lockutils [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.957 2 DEBUG oslo_concurrency.lockutils [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.957 2 DEBUG oslo_concurrency.lockutils [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.957 2 DEBUG nova.compute.manager [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] No waiting events found dispatching network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:33 compute-0 nova_compute[192903]: 2025-10-06 14:06:33.957 2 DEBUG nova.compute.manager [req-4b24f89a-cc94-4d18-b7f8-3ab6d4abfcf3 req-7c3d4b20-3951-402d-a33f-2e75e1eb4cb2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Received event network-vif-unplugged-0d38f548-fe64-428b-beab-0b96200911a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:34 compute-0 nova_compute[192903]: 2025-10-06 14:06:34.909 2 DEBUG nova.network.neutron [-] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:35 compute-0 nova_compute[192903]: 2025-10-06 14:06:35.414 2 INFO nova.compute.manager [-] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Took 2.55 seconds to deallocate network for instance.
Oct 06 14:06:35 compute-0 nova_compute[192903]: 2025-10-06 14:06:35.936 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:35 compute-0 nova_compute[192903]: 2025-10-06 14:06:35.937 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:35 compute-0 nova_compute[192903]: 2025-10-06 14:06:35.978 2 DEBUG nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.005 2 DEBUG nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.006 2 DEBUG nova.compute.provider_tree [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.019 2 DEBUG nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.026 2 DEBUG nova.compute.manager [req-c4cd5db2-0f14-4910-9fdd-d5e2dd44d782 req-088954b2-dd9d-4d25-a1a7-76bd289bbc58 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 9453c63e-8e53-4d5f-9571-c0dfe2365ef9] Received event network-vif-deleted-0d38f548-fe64-428b-beab-0b96200911a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.048 2 DEBUG nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.125 2 DEBUG nova.compute.provider_tree [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:36 compute-0 nova_compute[192903]: 2025-10-06 14:06:36.637 2 DEBUG nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:37 compute-0 nova_compute[192903]: 2025-10-06 14:06:37.148 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.211s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:37 compute-0 nova_compute[192903]: 2025-10-06 14:06:37.167 2 INFO nova.scheduler.client.report [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Deleted allocations for instance 9453c63e-8e53-4d5f-9571-c0dfe2365ef9
Oct 06 14:06:37 compute-0 nova_compute[192903]: 2025-10-06 14:06:37.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:38 compute-0 nova_compute[192903]: 2025-10-06 14:06:38.198 2 DEBUG oslo_concurrency.lockutils [None req-b65cd5de-9de2-4edc-9c33-84a00cce1c37 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "9453c63e-8e53-4d5f-9571-c0dfe2365ef9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.178s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:38 compute-0 nova_compute[192903]: 2025-10-06 14:06:38.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.969 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.970 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.971 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.971 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.972 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:39 compute-0 nova_compute[192903]: 2025-10-06 14:06:39.986 2 INFO nova.compute.manager [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Terminating instance
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.505 2 DEBUG nova.compute.manager [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:06:40 compute-0 kernel: tap367788c3-83 (unregistering): left promiscuous mode
Oct 06 14:06:40 compute-0 NetworkManager[52035]: <info>  [1759759600.5288] device (tap367788c3-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:06:40 compute-0 ovn_controller[95205]: 2025-10-06T14:06:40Z|00084|binding|INFO|Releasing lport 367788c3-83c2-4360-a817-da04de69a6a2 from this chassis (sb_readonly=0)
Oct 06 14:06:40 compute-0 ovn_controller[95205]: 2025-10-06T14:06:40Z|00085|binding|INFO|Setting lport 367788c3-83c2-4360-a817-da04de69a6a2 down in Southbound
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:40 compute-0 ovn_controller[95205]: 2025-10-06T14:06:40Z|00086|binding|INFO|Removing iface tap367788c3-83 ovn-installed in OVS
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.551 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:91:80 10.100.0.8'], port_security=['fa:16:3e:7b:91:80 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '46246aa4-aa4f-4a8e-93ba-5fc685a531a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69d92bff-38df-455c-b731-a2864652e2a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20952eb66a9c4fd2905273fb8f800689', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'fb73afaa-d848-4024-8ddb-c9e9b62d7d4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbff1ef9-0a9d-4ab1-8784-5e2a9c678396, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=367788c3-83c2-4360-a817-da04de69a6a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.552 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 367788c3-83c2-4360-a817-da04de69a6a2 in datapath 69d92bff-38df-455c-b731-a2864652e2a5 unbound from our chassis
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.554 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 69d92bff-38df-455c-b731-a2864652e2a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.555 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[20bed3ab-dd6f-4be3-918c-53266e9b1e9a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.555 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5 namespace which is not needed anymore
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:40 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 06 14:06:40 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 22.723s CPU time.
Oct 06 14:06:40 compute-0 systemd-machined[152985]: Machine qemu-2-instance-00000005 terminated.
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.685 2 DEBUG nova.compute.manager [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.686 2 DEBUG oslo_concurrency.lockutils [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.686 2 DEBUG oslo_concurrency.lockutils [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.687 2 DEBUG oslo_concurrency.lockutils [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.687 2 DEBUG nova.compute.manager [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] No waiting events found dispatching network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.687 2 DEBUG nova.compute.manager [req-67ca9ed5-f40c-4e1f-b4a4-92d9a45ab8a4 req-3e6ae006-7c0c-41ef-96d3-bab42523c914 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:40 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [NOTICE]   (217357) : haproxy version is 3.0.5-8e879a5
Oct 06 14:06:40 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [NOTICE]   (217357) : path to executable is /usr/sbin/haproxy
Oct 06 14:06:40 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [WARNING]  (217357) : Exiting Master process...
Oct 06 14:06:40 compute-0 podman[218870]: 2025-10-06 14:06:40.727782114 +0000 UTC m=+0.049747270 container kill 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 06 14:06:40 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [ALERT]    (217357) : Current worker (217359) exited with code 143 (Terminated)
Oct 06 14:06:40 compute-0 neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5[217353]: [WARNING]  (217357) : All workers exited. Exiting... (0)
Oct 06 14:06:40 compute-0 systemd[1]: libpod-00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00.scope: Deactivated successfully.
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.783 2 INFO nova.virt.libvirt.driver [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Instance destroyed successfully.
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.784 2 DEBUG nova.objects.instance [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lazy-loading 'resources' on Instance uuid 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:06:40 compute-0 podman[218891]: 2025-10-06 14:06:40.807880647 +0000 UTC m=+0.051003095 container died 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:06:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00-userdata-shm.mount: Deactivated successfully.
Oct 06 14:06:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c57c60c8a133e7bdce6c167312b77fdf0a4f4f4def04b71056d062caa7e380d-merged.mount: Deactivated successfully.
Oct 06 14:06:40 compute-0 podman[218891]: 2025-10-06 14:06:40.850315598 +0000 UTC m=+0.093437976 container cleanup 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:06:40 compute-0 systemd[1]: libpod-conmon-00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00.scope: Deactivated successfully.
Oct 06 14:06:40 compute-0 podman[218898]: 2025-10-06 14:06:40.874095713 +0000 UTC m=+0.099759807 container remove 00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.882 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e7882abb-e500-4dbc-9a61-f64c212ff111]: (4, ("Mon Oct  6 02:06:40 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5 (00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00)\n00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00\nMon Oct  6 02:06:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5 (00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00)\n00557db8af820e359abf69309f35835a1a0b095346e615b497b6b82163be5f00\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.883 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[653eb8fb-0530-4b76-99aa-b76c167089f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.883 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69d92bff-38df-455c-b731-a2864652e2a5.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.884 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[31c6a9f2-c4c3-4bd0-bcb9-87697fc0369b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.885 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69d92bff-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:40 compute-0 kernel: tap69d92bff-30: left promiscuous mode
Oct 06 14:06:40 compute-0 nova_compute[192903]: 2025-10-06 14:06:40.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.912 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[70198b54-7bfc-4eae-9e87-902cef778fad]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.942 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9e746f-3275-4121-97f1-a07903f9f341]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.943 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa6b7c2-d612-4252-ab5d-ff3ccac82b1c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.968 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b296971d-208a-450a-ad4c-51d1637ad67a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384384, 'reachable_time': 27640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218935, 'error': None, 'target': 'ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.971 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-69d92bff-38df-455c-b731-a2864652e2a5 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:06:40 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:40.972 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0e9c01-eabf-46ac-a41f-00ff5b96b6e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:06:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d69d92bff\x2d38df\x2d455c\x2db731\x2da2864652e2a5.mount: Deactivated successfully.
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.291 2 DEBUG nova.virt.libvirt.vif [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:02:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-669924934',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-669924934',id=5,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:03:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20952eb66a9c4fd2905273fb8f800689',ramdisk_id='',reservation_id='r-db5hsdgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1260248176',owner_user_name='tempest-TestExecuteActionsViaActuator-1260248176-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:03:04Z,user_data=None,user_id='4beaed30a2ec47bb9b5f6adb81ede0f7',uuid=46246aa4-aa4f-4a8e-93ba-5fc685a531a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.293 2 DEBUG nova.network.os_vif_util [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converting VIF {"id": "367788c3-83c2-4360-a817-da04de69a6a2", "address": "fa:16:3e:7b:91:80", "network": {"id": "69d92bff-38df-455c-b731-a2864652e2a5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1750716857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f79c2b9daff04f20aa823813dfdde9e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap367788c3-83", "ovs_interfaceid": "367788c3-83c2-4360-a817-da04de69a6a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.294 2 DEBUG nova.network.os_vif_util [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.295 2 DEBUG os_vif [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap367788c3-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=09ed6a01-bf06-481e-9c73-3250a0051f5c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.308 2 INFO os_vif [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:91:80,bridge_name='br-int',has_traffic_filtering=True,id=367788c3-83c2-4360-a817-da04de69a6a2,network=Network(69d92bff-38df-455c-b731-a2864652e2a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap367788c3-83')
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.309 2 INFO nova.virt.libvirt.driver [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Deleting instance files /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0_del
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.310 2 INFO nova.virt.libvirt.driver [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Deletion of /var/lib/nova/instances/46246aa4-aa4f-4a8e-93ba-5fc685a531a0_del complete
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.826 2 INFO nova.compute.manager [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.827 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.827 2 DEBUG nova.compute.manager [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.828 2 DEBUG nova.network.neutron [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:06:41 compute-0 nova_compute[192903]: 2025-10-06 14:06:41.828 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.730 2 DEBUG nova.compute.manager [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.731 2 DEBUG oslo_concurrency.lockutils [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.731 2 DEBUG oslo_concurrency.lockutils [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.732 2 DEBUG oslo_concurrency.lockutils [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.732 2 DEBUG nova.compute.manager [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] No waiting events found dispatching network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.733 2 DEBUG nova.compute.manager [req-9cf35186-d44f-4810-a468-68fe9fad89e8 req-13ae1412-8efc-4c89-a6c0-63bcf20f789e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-unplugged-367788c3-83c2-4360-a817-da04de69a6a2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:06:42 compute-0 nova_compute[192903]: 2025-10-06 14:06:42.812 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:06:43 compute-0 nova_compute[192903]: 2025-10-06 14:06:43.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:44 compute-0 nova_compute[192903]: 2025-10-06 14:06:44.302 2 DEBUG nova.compute.manager [req-92011c02-eed3-49f8-a8bf-d2ce2f3ab559 req-694c9261-0d1b-4733-aac6-24cea15b0d05 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Received event network-vif-deleted-367788c3-83c2-4360-a817-da04de69a6a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:06:44 compute-0 nova_compute[192903]: 2025-10-06 14:06:44.303 2 INFO nova.compute.manager [req-92011c02-eed3-49f8-a8bf-d2ce2f3ab559 req-694c9261-0d1b-4733-aac6-24cea15b0d05 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Neutron deleted interface 367788c3-83c2-4360-a817-da04de69a6a2; detaching it from the instance and deleting it from the info cache
Oct 06 14:06:44 compute-0 nova_compute[192903]: 2025-10-06 14:06:44.303 2 DEBUG nova.network.neutron [req-92011c02-eed3-49f8-a8bf-d2ce2f3ab559 req-694c9261-0d1b-4733-aac6-24cea15b0d05 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:44 compute-0 nova_compute[192903]: 2025-10-06 14:06:44.763 2 DEBUG nova.network.neutron [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:06:44 compute-0 nova_compute[192903]: 2025-10-06 14:06:44.810 2 DEBUG nova.compute.manager [req-92011c02-eed3-49f8-a8bf-d2ce2f3ab559 req-694c9261-0d1b-4733-aac6-24cea15b0d05 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Detach interface failed, port_id=367788c3-83c2-4360-a817-da04de69a6a2, reason: Instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:06:45 compute-0 nova_compute[192903]: 2025-10-06 14:06:45.271 2 INFO nova.compute.manager [-] [instance: 46246aa4-aa4f-4a8e-93ba-5fc685a531a0] Took 3.44 seconds to deallocate network for instance.
Oct 06 14:06:45 compute-0 nova_compute[192903]: 2025-10-06 14:06:45.798 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:45 compute-0 nova_compute[192903]: 2025-10-06 14:06:45.799 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:45 compute-0 nova_compute[192903]: 2025-10-06 14:06:45.870 2 DEBUG nova.compute.provider_tree [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:46 compute-0 podman[218936]: 2025-10-06 14:06:46.203537208 +0000 UTC m=+0.065535475 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.379 2 DEBUG nova.scheduler.client.report [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.891 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:46 compute-0 nova_compute[192903]: 2025-10-06 14:06:46.916 2 INFO nova.scheduler.client.report [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Deleted allocations for instance 46246aa4-aa4f-4a8e-93ba-5fc685a531a0
Oct 06 14:06:47 compute-0 nova_compute[192903]: 2025-10-06 14:06:47.947 2 DEBUG oslo_concurrency.lockutils [None req-5f61bcd5-02c0-40f4-ab71-fdf6a3ddd00a 4beaed30a2ec47bb9b5f6adb81ede0f7 20952eb66a9c4fd2905273fb8f800689 - - default default] Lock "46246aa4-aa4f-4a8e-93ba-5fc685a531a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.977s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:48 compute-0 nova_compute[192903]: 2025-10-06 14:06:48.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:49 compute-0 podman[218963]: 2025-10-06 14:06:49.21345369 +0000 UTC m=+0.062263272 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 14:06:49 compute-0 podman[218962]: 2025-10-06 14:06:49.223262569 +0000 UTC m=+0.076160165 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 14:06:49 compute-0 podman[218961]: 2025-10-06 14:06:49.264426274 +0000 UTC m=+0.119689990 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, container_name=ovn_controller)
Oct 06 14:06:49 compute-0 nova_compute[192903]: 2025-10-06 14:06:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:50 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:50.020 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:06:50 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:50.021 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:06:50 compute-0 nova_compute[192903]: 2025-10-06 14:06:50.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:51 compute-0 nova_compute[192903]: 2025-10-06 14:06:51.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:51 compute-0 nova_compute[192903]: 2025-10-06 14:06:51.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:51 compute-0 nova_compute[192903]: 2025-10-06 14:06:51.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:06:51 compute-0 nova_compute[192903]: 2025-10-06 14:06:51.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.099 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.253 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.254 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.295 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.296 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=73.30595779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.296 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:06:52 compute-0 nova_compute[192903]: 2025-10-06 14:06:52.296 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:06:53 compute-0 nova_compute[192903]: 2025-10-06 14:06:53.333 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:06:53 compute-0 nova_compute[192903]: 2025-10-06 14:06:53.334 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:06:52 up  1:07,  0 user,  load average: 0.87, 0.51, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:06:53 compute-0 nova_compute[192903]: 2025-10-06 14:06:53.352 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:06:53 compute-0 nova_compute[192903]: 2025-10-06 14:06:53.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:53 compute-0 nova_compute[192903]: 2025-10-06 14:06:53.859 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:06:54 compute-0 nova_compute[192903]: 2025-10-06 14:06:54.370 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:06:54 compute-0 nova_compute[192903]: 2025-10-06 14:06:54.370 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.074s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:06:55 compute-0 nova_compute[192903]: 2025-10-06 14:06:55.366 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:55 compute-0 nova_compute[192903]: 2025-10-06 14:06:55.877 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:55 compute-0 nova_compute[192903]: 2025-10-06 14:06:55.878 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:55 compute-0 nova_compute[192903]: 2025-10-06 14:06:55.878 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:06:56 compute-0 nova_compute[192903]: 2025-10-06 14:06:56.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:06:58.022 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:06:58 compute-0 nova_compute[192903]: 2025-10-06 14:06:58.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:06:59 compute-0 podman[219030]: 2025-10-06 14:06:59.220871462 +0000 UTC m=+0.077249093 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:06:59 compute-0 podman[203308]: time="2025-10-06T14:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:06:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:06:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 06 14:07:00 compute-0 nova_compute[192903]: 2025-10-06 14:07:00.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:01 compute-0 nova_compute[192903]: 2025-10-06 14:07:01.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: ERROR   14:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: ERROR   14:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: ERROR   14:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: ERROR   14:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: ERROR   14:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:07:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:07:02 compute-0 podman[219052]: 2025-10-06 14:07:02.215566866 +0000 UTC m=+0.069161447 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct 06 14:07:03 compute-0 nova_compute[192903]: 2025-10-06 14:07:03.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:06 compute-0 nova_compute[192903]: 2025-10-06 14:07:06.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:08 compute-0 nova_compute[192903]: 2025-10-06 14:07:08.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:09.904 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:25:16 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37fbf627b5a647e5a616e5d55c765875', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb873b11-6bc3-4cba-8b83-39f2042a0d3f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1c2f1045-4132-4ec7-9b93-34567076228a) old=Port_Binding(mac=['fa:16:3e:ab:25:16'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37fbf627b5a647e5a616e5d55c765875', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:07:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:09.904 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1c2f1045-4132-4ec7-9b93-34567076228a in datapath f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 updated
Oct 06 14:07:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:09.905 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:07:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:09.906 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f19dc8fd-dcf7-4f53-b15b-68b43812abb6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:07:11 compute-0 nova_compute[192903]: 2025-10-06 14:07:11.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:11.361 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:11.362 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:11.362 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:13 compute-0 nova_compute[192903]: 2025-10-06 14:07:13.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:16 compute-0 nova_compute[192903]: 2025-10-06 14:07:16.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:16.708 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:ba:db 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-455a2612-8c18-4e3e-afb2-7b840da384de', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455a2612-8c18-4e3e-afb2-7b840da384de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4daf014e-e457-46d1-878f-9eb472e3190d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0128fb48-cb14-4780-84b1-7dc3d595d6ed) old=Port_Binding(mac=['fa:16:3e:a5:ba:db'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-455a2612-8c18-4e3e-afb2-7b840da384de', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455a2612-8c18-4e3e-afb2-7b840da384de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:07:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:16.709 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0128fb48-cb14-4780-84b1-7dc3d595d6ed in datapath 455a2612-8c18-4e3e-afb2-7b840da384de updated
Oct 06 14:07:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:16.711 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455a2612-8c18-4e3e-afb2-7b840da384de, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:07:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:16.712 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1c88da5b-948a-4744-9c38-6b308de0981e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:07:17 compute-0 podman[219074]: 2025-10-06 14:07:17.218488668 +0000 UTC m=+0.072716648 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:07:18 compute-0 nova_compute[192903]: 2025-10-06 14:07:18.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:20 compute-0 podman[219099]: 2025-10-06 14:07:20.23595016 +0000 UTC m=+0.083045170 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:07:20 compute-0 podman[219100]: 2025-10-06 14:07:20.293572684 +0000 UTC m=+0.134663491 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 06 14:07:20 compute-0 podman[219098]: 2025-10-06 14:07:20.295986595 +0000 UTC m=+0.150239117 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:07:21 compute-0 nova_compute[192903]: 2025-10-06 14:07:21.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:23 compute-0 nova_compute[192903]: 2025-10-06 14:07:23.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:26 compute-0 nova_compute[192903]: 2025-10-06 14:07:26.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:28 compute-0 nova_compute[192903]: 2025-10-06 14:07:28.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:29 compute-0 podman[203308]: time="2025-10-06T14:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:07:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:07:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Oct 06 14:07:30 compute-0 podman[219166]: 2025-10-06 14:07:30.209058556 +0000 UTC m=+0.075675613 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid)
Oct 06 14:07:31 compute-0 nova_compute[192903]: 2025-10-06 14:07:31.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: ERROR   14:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: ERROR   14:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: ERROR   14:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: ERROR   14:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: ERROR   14:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:07:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:07:32 compute-0 ovn_controller[95205]: 2025-10-06T14:07:32Z|00087|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 06 14:07:33 compute-0 podman[219188]: 2025-10-06 14:07:33.237706272 +0000 UTC m=+0.093401983 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal)
Oct 06 14:07:33 compute-0 nova_compute[192903]: 2025-10-06 14:07:33.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:36 compute-0 nova_compute[192903]: 2025-10-06 14:07:36.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:36.834 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:07:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:36.835 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:07:36 compute-0 nova_compute[192903]: 2025-10-06 14:07:36.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:38 compute-0 nova_compute[192903]: 2025-10-06 14:07:38.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:41 compute-0 nova_compute[192903]: 2025-10-06 14:07:41.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:07:43 compute-0 nova_compute[192903]: 2025-10-06 14:07:43.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:45 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:07:45.836 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:07:46 compute-0 nova_compute[192903]: 2025-10-06 14:07:46.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:47 compute-0 nova_compute[192903]: 2025-10-06 14:07:47.090 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:47 compute-0 nova_compute[192903]: 2025-10-06 14:07:47.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:48 compute-0 podman[219211]: 2025-10-06 14:07:48.188685362 +0000 UTC m=+0.057313336 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:07:48 compute-0 nova_compute[192903]: 2025-10-06 14:07:48.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:51 compute-0 podman[219238]: 2025-10-06 14:07:51.20553794 +0000 UTC m=+0.060005124 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Oct 06 14:07:51 compute-0 podman[219236]: 2025-10-06 14:07:51.217451853 +0000 UTC m=+0.081742997 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:07:51 compute-0 podman[219237]: 2025-10-06 14:07:51.247698111 +0000 UTC m=+0.099544609 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:07:51 compute-0 nova_compute[192903]: 2025-10-06 14:07:51.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:51 compute-0 nova_compute[192903]: 2025-10-06 14:07:51.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:51 compute-0 nova_compute[192903]: 2025-10-06 14:07:51.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.259 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.260 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.288 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.289 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=73.30595779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.289 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.289 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.567 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:52 compute-0 nova_compute[192903]: 2025-10-06 14:07:52.568 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.074 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.621 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.838 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance ed5ab92d-5355-4703-8afc-71d5bea99132 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.838 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.838 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:07:52 up  1:08,  0 user,  load average: 0.45, 0.47, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:07:53 compute-0 nova_compute[192903]: 2025-10-06 14:07:53.880 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.385 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.894 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.895 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.606s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.895 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.274s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.903 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:07:54 compute-0 nova_compute[192903]: 2025-10-06 14:07:54.903 2 INFO nova.compute.claims [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:07:55 compute-0 nova_compute[192903]: 2025-10-06 14:07:55.894 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:55 compute-0 nova_compute[192903]: 2025-10-06 14:07:55.895 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:55 compute-0 nova_compute[192903]: 2025-10-06 14:07:55.895 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:55 compute-0 nova_compute[192903]: 2025-10-06 14:07:55.895 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:07:55 compute-0 nova_compute[192903]: 2025-10-06 14:07:55.959 2 DEBUG nova.compute.provider_tree [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:07:56 compute-0 nova_compute[192903]: 2025-10-06 14:07:56.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:56 compute-0 nova_compute[192903]: 2025-10-06 14:07:56.467 2 DEBUG nova.scheduler.client.report [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:07:56 compute-0 nova_compute[192903]: 2025-10-06 14:07:56.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:07:56 compute-0 nova_compute[192903]: 2025-10-06 14:07:56.978 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:56 compute-0 nova_compute[192903]: 2025-10-06 14:07:56.979 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:07:57 compute-0 nova_compute[192903]: 2025-10-06 14:07:57.492 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:07:57 compute-0 nova_compute[192903]: 2025-10-06 14:07:57.492 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:07:57 compute-0 nova_compute[192903]: 2025-10-06 14:07:57.493 2 WARNING neutronclient.v2_0.client [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:07:57 compute-0 nova_compute[192903]: 2025-10-06 14:07:57.493 2 WARNING neutronclient.v2_0.client [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:07:57 compute-0 nova_compute[192903]: 2025-10-06 14:07:57.985 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Successfully created port: b40989b0-f6eb-4f13-8c80-7c66fdbc387a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:07:58 compute-0 nova_compute[192903]: 2025-10-06 14:07:58.001 2 INFO nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:07:58 compute-0 nova_compute[192903]: 2025-10-06 14:07:58.511 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:07:58 compute-0 nova_compute[192903]: 2025-10-06 14:07:58.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.530 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.532 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.533 2 INFO nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Creating image(s)
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.534 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.534 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.536 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.537 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.543 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.545 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.603 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.604 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.604 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.605 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.607 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.608 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.668 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.669 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.721 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.723 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.723 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:59 compute-0 podman[203308]: time="2025-10-06T14:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:07:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:07:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.817 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.818 2 DEBUG nova.virt.disk.api [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Checking if we can resize image /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.819 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.880 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.882 2 DEBUG nova.virt.disk.api [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Cannot resize image /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.883 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.883 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Ensure instance console log exists: /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.884 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.884 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:07:59 compute-0 nova_compute[192903]: 2025-10-06 14:07:59.885 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.035 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Successfully updated port: b40989b0-f6eb-4f13-8c80-7c66fdbc387a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.086 2 DEBUG nova.compute.manager [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-changed-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.086 2 DEBUG nova.compute.manager [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Refreshing instance network info cache due to event network-changed-b40989b0-f6eb-4f13-8c80-7c66fdbc387a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.087 2 DEBUG oslo_concurrency.lockutils [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.087 2 DEBUG oslo_concurrency.lockutils [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.088 2 DEBUG nova.network.neutron [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Refreshing network info cache for port b40989b0-f6eb-4f13-8c80-7c66fdbc387a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.542 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.595 2 WARNING neutronclient.v2_0.client [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:00 compute-0 nova_compute[192903]: 2025-10-06 14:08:00.845 2 DEBUG nova.network.neutron [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:08:01 compute-0 nova_compute[192903]: 2025-10-06 14:08:01.001 2 DEBUG nova.network.neutron [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:08:01 compute-0 podman[219314]: 2025-10-06 14:08:01.202320732 +0000 UTC m=+0.068265524 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:08:01 compute-0 nova_compute[192903]: 2025-10-06 14:08:01.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: ERROR   14:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: ERROR   14:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: ERROR   14:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: ERROR   14:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: ERROR   14:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:08:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:08:01 compute-0 nova_compute[192903]: 2025-10-06 14:08:01.508 2 DEBUG oslo_concurrency.lockutils [req-68f53ae3-35eb-4e7a-8dc1-0d523e1fbce0 req-c6357aac-fbfc-45de-a763-010929ffbf7d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:08:01 compute-0 nova_compute[192903]: 2025-10-06 14:08:01.509 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquired lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:08:01 compute-0 nova_compute[192903]: 2025-10-06 14:08:01.510 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:08:02 compute-0 nova_compute[192903]: 2025-10-06 14:08:02.843 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:08:03 compute-0 nova_compute[192903]: 2025-10-06 14:08:03.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:03 compute-0 nova_compute[192903]: 2025-10-06 14:08:03.828 2 WARNING neutronclient.v2_0.client [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:03 compute-0 podman[219335]: 2025-10-06 14:08:03.963474166 +0000 UTC m=+0.096169493 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.034 2 DEBUG nova.network.neutron [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Updating instance_info_cache with network_info: [{"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.543 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Releasing lock "refresh_cache-ed5ab92d-5355-4703-8afc-71d5bea99132" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.544 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance network_info: |[{"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.548 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Start _get_guest_xml network_info=[{"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.553 2 WARNING nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.555 2 DEBUG nova.virt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1152128813', uuid='ed5ab92d-5355-4703-8afc-71d5bea99132'), owner=OwnerMeta(userid='d9a309fbe58c4b158f4fb1f5a9ae1216', username='tempest-TestExecuteBasicStrategy-282608784-project-admin', projectid='5755f5f126624f6b82371d76f860b4cc', projectname='tempest-TestExecuteBasicStrategy-282608784'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759684.5554655) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.560 2 DEBUG nova.virt.libvirt.host [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.561 2 DEBUG nova.virt.libvirt.host [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.563 2 DEBUG nova.virt.libvirt.host [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.564 2 DEBUG nova.virt.libvirt.host [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.564 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.565 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.565 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.566 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.566 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.567 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.567 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.568 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.569 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.569 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.569 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.570 2 DEBUG nova.virt.hardware [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.576 2 DEBUG nova.virt.libvirt.vif [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:07:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1152128813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1152128813',id=11,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5755f5f126624f6b82371d76f860b4cc',ramdisk_id='',reservation_id='r-f8kojz0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-282608784',owner_user_name='tempest-TestExecuteBasicStrategy-282608784-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:07:58Z,user_data=None,user_id='d9a309fbe58c4b158f4fb1f5a9ae1216',uuid=ed5ab92d-5355-4703-8afc-71d5bea99132,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.577 2 DEBUG nova.network.os_vif_util [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converting VIF {"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.578 2 DEBUG nova.network.os_vif_util [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:08:04 compute-0 nova_compute[192903]: 2025-10-06 14:08:04.580 2 DEBUG nova.objects.instance [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lazy-loading 'pci_devices' on Instance uuid ed5ab92d-5355-4703-8afc-71d5bea99132 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.091 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <uuid>ed5ab92d-5355-4703-8afc-71d5bea99132</uuid>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <name>instance-0000000b</name>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1152128813</nova:name>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:08:04</nova:creationTime>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:08:05 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:08:05 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:user uuid="d9a309fbe58c4b158f4fb1f5a9ae1216">tempest-TestExecuteBasicStrategy-282608784-project-admin</nova:user>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:project uuid="5755f5f126624f6b82371d76f860b4cc">tempest-TestExecuteBasicStrategy-282608784</nova:project>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         <nova:port uuid="b40989b0-f6eb-4f13-8c80-7c66fdbc387a">
Oct 06 14:08:05 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <system>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="serial">ed5ab92d-5355-4703-8afc-71d5bea99132</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="uuid">ed5ab92d-5355-4703-8afc-71d5bea99132</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </system>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <os>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </os>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <features>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </features>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.config"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:a2:6f:28"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <target dev="tapb40989b0-f6"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/console.log" append="off"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <video>
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </video>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:08:05 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:08:05 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:08:05 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:08:05 compute-0 nova_compute[192903]: </domain>
Oct 06 14:08:05 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.093 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Preparing to wait for external event network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.094 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.094 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.094 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.095 2 DEBUG nova.virt.libvirt.vif [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:07:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1152128813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1152128813',id=11,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5755f5f126624f6b82371d76f860b4cc',ramdisk_id='',reservation_id='r-f8kojz0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-282608784',owner_user_name='tempest-TestExecuteBasicStrategy-28
2608784-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:07:58Z,user_data=None,user_id='d9a309fbe58c4b158f4fb1f5a9ae1216',uuid=ed5ab92d-5355-4703-8afc-71d5bea99132,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.095 2 DEBUG nova.network.os_vif_util [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converting VIF {"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.096 2 DEBUG nova.network.os_vif_util [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.096 2 DEBUG os_vif [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.097 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.099 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '82b77faa-2ab8-5fab-98de-f6d34b88bb65', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.142 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb40989b0-f6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.143 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb40989b0-f6, col_values=(('qos', UUID('ff4be241-bdcd-4043-842b-0c381e30ced0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.143 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb40989b0-f6, col_values=(('external_ids', {'iface-id': 'b40989b0-f6eb-4f13-8c80-7c66fdbc387a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:6f:28', 'vm-uuid': 'ed5ab92d-5355-4703-8afc-71d5bea99132'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 NetworkManager[52035]: <info>  [1759759685.1457] manager: (tapb40989b0-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:05 compute-0 nova_compute[192903]: 2025-10-06 14:08:05.153 2 INFO os_vif [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6')
Oct 06 14:08:06 compute-0 nova_compute[192903]: 2025-10-06 14:08:06.732 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:08:06 compute-0 nova_compute[192903]: 2025-10-06 14:08:06.733 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:08:06 compute-0 nova_compute[192903]: 2025-10-06 14:08:06.734 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] No VIF found with MAC fa:16:3e:a2:6f:28, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:08:06 compute-0 nova_compute[192903]: 2025-10-06 14:08:06.735 2 INFO nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Using config drive
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.250 2 WARNING neutronclient.v2_0.client [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.379 2 INFO nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Creating config drive at /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.config
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.386 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp33yb0wjq execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.517 2 DEBUG oslo_concurrency.processutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp33yb0wjq" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:07 compute-0 kernel: tapb40989b0-f6: entered promiscuous mode
Oct 06 14:08:07 compute-0 ovn_controller[95205]: 2025-10-06T14:08:07Z|00088|binding|INFO|Claiming lport b40989b0-f6eb-4f13-8c80-7c66fdbc387a for this chassis.
Oct 06 14:08:07 compute-0 ovn_controller[95205]: 2025-10-06T14:08:07Z|00089|binding|INFO|b40989b0-f6eb-4f13-8c80-7c66fdbc387a: Claiming fa:16:3e:a2:6f:28 10.100.0.10
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:07 compute-0 NetworkManager[52035]: <info>  [1759759687.6211] manager: (tapb40989b0-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:07 compute-0 systemd-udevd[219376]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.663 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:6f:28 10.100.0.10'], port_security=['fa:16:3e:a2:6f:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed5ab92d-5355-4703-8afc-71d5bea99132', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '72b17cd8-5f00-46da-a023-10d2a647ccff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb873b11-6bc3-4cba-8b83-39f2042a0d3f, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=b40989b0-f6eb-4f13-8c80-7c66fdbc387a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.664 104072 INFO neutron.agent.ovn.metadata.agent [-] Port b40989b0-f6eb-4f13-8c80-7c66fdbc387a in datapath f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 bound to our chassis
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.666 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774
Oct 06 14:08:07 compute-0 systemd-machined[152985]: New machine qemu-7-instance-0000000b.
Oct 06 14:08:07 compute-0 NetworkManager[52035]: <info>  [1759759687.6843] device (tapb40989b0-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:08:07 compute-0 NetworkManager[52035]: <info>  [1759759687.6853] device (tapb40989b0-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.684 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2b71668b-e5d8-46e4-a770-20d103fae937]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.685 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2d0b7b9-f1 in ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.687 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2d0b7b9-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.688 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c5769c43-a78a-4ae8-88d2-12912dc46cfe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.689 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f2b880-62c8-48af-ad7e-2318fc8f33db]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.707 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[9362ad30-806c-492b-837c-761ca994a063]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.715 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[20cbe6bb-d95c-4c8f-baf5-c7b84e9ebd5a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_controller[95205]: 2025-10-06T14:08:07Z|00090|binding|INFO|Setting lport b40989b0-f6eb-4f13-8c80-7c66fdbc387a ovn-installed in OVS
Oct 06 14:08:07 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Oct 06 14:08:07 compute-0 ovn_controller[95205]: 2025-10-06T14:08:07Z|00091|binding|INFO|Setting lport b40989b0-f6eb-4f13-8c80-7c66fdbc387a up in Southbound
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.754 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[a200dd2a-9e18-47a0-a0ab-98ca762e8381]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.761 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8d5bef-f856-46d2-af3c-6a79ce1d4cd4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 NetworkManager[52035]: <info>  [1759759687.7629] manager: (tapf2d0b7b9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.797 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa65a66-93f0-4104-9861-f3a40445dd28]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.801 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb48c73-0f1b-445e-a251-a0059e21adac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 NetworkManager[52035]: <info>  [1759759687.8318] device (tapf2d0b7b9-f0): carrier: link connected
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.838 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fa38ed-f8b6-4d94-8dd2-250756feb65c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.860 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[13b28f96-e698-4ac5-af2b-3e9bd04caa2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2d0b7b9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:25:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414947, 'reachable_time': 29322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219411, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.881 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c50eda8c-c259-480b-a4a0-d2c115456d3c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:2516'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414947, 'tstamp': 414947}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219412, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.902 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[982466e2-4f73-49c4-aa2d-27c7c8474934]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2d0b7b9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:25:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414947, 'reachable_time': 29322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219413, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.944 2 DEBUG nova.compute.manager [req-5d691c8c-f806-4a01-80e9-7f50f1e39dfd req-4ca17517-f6a4-423c-a5fe-8dacc307144f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.946 2 DEBUG oslo_concurrency.lockutils [req-5d691c8c-f806-4a01-80e9-7f50f1e39dfd req-4ca17517-f6a4-423c-a5fe-8dacc307144f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.947 2 DEBUG oslo_concurrency.lockutils [req-5d691c8c-f806-4a01-80e9-7f50f1e39dfd req-4ca17517-f6a4-423c-a5fe-8dacc307144f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.947 2 DEBUG oslo_concurrency.lockutils [req-5d691c8c-f806-4a01-80e9-7f50f1e39dfd req-4ca17517-f6a4-423c-a5fe-8dacc307144f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:07.947 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5e789c96-4fb2-4130-bf48-5e3a90469180]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:07 compute-0 nova_compute[192903]: 2025-10-06 14:08:07.948 2 DEBUG nova.compute.manager [req-5d691c8c-f806-4a01-80e9-7f50f1e39dfd req-4ca17517-f6a4-423c-a5fe-8dacc307144f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Processing event network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.026 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e37f8e-e464-4f12-9f63-91c87eb3b99a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.027 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2d0b7b9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.028 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.028 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2d0b7b9-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:08 compute-0 nova_compute[192903]: 2025-10-06 14:08:08.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:08 compute-0 kernel: tapf2d0b7b9-f0: entered promiscuous mode
Oct 06 14:08:08 compute-0 NetworkManager[52035]: <info>  [1759759688.0314] manager: (tapf2d0b7b9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct 06 14:08:08 compute-0 nova_compute[192903]: 2025-10-06 14:08:08.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.034 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2d0b7b9-f0, col_values=(('external_ids', {'iface-id': '1c2f1045-4132-4ec7-9b93-34567076228a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:08 compute-0 nova_compute[192903]: 2025-10-06 14:08:08.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:08 compute-0 ovn_controller[95205]: 2025-10-06T14:08:08Z|00092|binding|INFO|Releasing lport 1c2f1045-4132-4ec7-9b93-34567076228a from this chassis (sb_readonly=0)
Oct 06 14:08:08 compute-0 nova_compute[192903]: 2025-10-06 14:08:08.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.070 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[58c44694-987c-478e-8f73-85e0c7cb8202]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.070 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.071 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.071 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.071 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.071 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[94e20fbd-9f36-4765-92a5-1ac329c89d59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.072 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.074 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2708bd3e-d83a-483d-98d6-51ec81a667f2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.074 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:08:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:08.075 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'env', 'PROCESS_TAG=haproxy-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:08:08 compute-0 podman[219447]: 2025-10-06 14:08:08.505397154 +0000 UTC m=+0.062418866 container create 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 06 14:08:08 compute-0 systemd[1]: Started libpod-conmon-789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129.scope.
Oct 06 14:08:08 compute-0 podman[219447]: 2025-10-06 14:08:08.473356121 +0000 UTC m=+0.030377813 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:08:08 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:08:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4f407bcb1e37e95fd531cf1a9fa6d24938c97d90e4391ae28a875cf035a5bed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:08:08 compute-0 nova_compute[192903]: 2025-10-06 14:08:08.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:08 compute-0 podman[219447]: 2025-10-06 14:08:08.662500894 +0000 UTC m=+0.219522636 container init 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:08:08 compute-0 podman[219447]: 2025-10-06 14:08:08.67020791 +0000 UTC m=+0.227229642 container start 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Oct 06 14:08:08 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [NOTICE]   (219466) : New worker (219468) forked
Oct 06 14:08:08 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [NOTICE]   (219466) : Loading success.
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.373 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.380 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.383 2 INFO nova.virt.libvirt.driver [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance spawned successfully.
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.383 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.900 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.901 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.902 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.902 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.903 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:09 compute-0 nova_compute[192903]: 2025-10-06 14:08:09.904 2 DEBUG nova.virt.libvirt.driver [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.024 2 DEBUG nova.compute.manager [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.025 2 DEBUG oslo_concurrency.lockutils [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.025 2 DEBUG oslo_concurrency.lockutils [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.025 2 DEBUG oslo_concurrency.lockutils [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.026 2 DEBUG nova.compute.manager [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] No waiting events found dispatching network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.026 2 WARNING nova.compute.manager [req-d34be8ec-ebcd-4012-b8cc-9e2d1f46904d req-9d9be8ed-f834-4065-95f8-be05c19e3a0e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received unexpected event network-vif-plugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a for instance with vm_state building and task_state spawning.
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.417 2 INFO nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Took 10.89 seconds to spawn the instance on the hypervisor.
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.418 2 DEBUG nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:08:10 compute-0 nova_compute[192903]: 2025-10-06 14:08:10.957 2 INFO nova.compute.manager [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Took 17.38 seconds to build instance.
Oct 06 14:08:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:11.363 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:11.363 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:11.363 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:11 compute-0 nova_compute[192903]: 2025-10-06 14:08:11.463 2 DEBUG oslo_concurrency.lockutils [None req-1a04bf36-fda0-40c4-af58-3b66ca243c5b d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.895s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:13 compute-0 nova_compute[192903]: 2025-10-06 14:08:13.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:15 compute-0 nova_compute[192903]: 2025-10-06 14:08:15.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:18 compute-0 nova_compute[192903]: 2025-10-06 14:08:18.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:19 compute-0 podman[219485]: 2025-10-06 14:08:19.233770247 +0000 UTC m=+0.091196997 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:08:20 compute-0 nova_compute[192903]: 2025-10-06 14:08:20.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:20 compute-0 ovn_controller[95205]: 2025-10-06T14:08:20Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:6f:28 10.100.0.10
Oct 06 14:08:20 compute-0 ovn_controller[95205]: 2025-10-06T14:08:20Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:6f:28 10.100.0.10
Oct 06 14:08:22 compute-0 podman[219529]: 2025-10-06 14:08:22.217787241 +0000 UTC m=+0.068068240 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Oct 06 14:08:22 compute-0 podman[219530]: 2025-10-06 14:08:22.233645463 +0000 UTC m=+0.080703170 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:08:22 compute-0 podman[219528]: 2025-10-06 14:08:22.249143757 +0000 UTC m=+0.099933029 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930)
Oct 06 14:08:23 compute-0 nova_compute[192903]: 2025-10-06 14:08:23.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:25 compute-0 nova_compute[192903]: 2025-10-06 14:08:25.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:28 compute-0 nova_compute[192903]: 2025-10-06 14:08:28.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:29 compute-0 podman[203308]: time="2025-10-06T14:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:08:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:08:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Oct 06 14:08:30 compute-0 nova_compute[192903]: 2025-10-06 14:08:30.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: ERROR   14:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: ERROR   14:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: ERROR   14:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: ERROR   14:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: ERROR   14:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:08:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:08:32 compute-0 podman[219594]: 2025-10-06 14:08:32.229299028 +0000 UTC m=+0.090955291 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 06 14:08:33 compute-0 nova_compute[192903]: 2025-10-06 14:08:33.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:34 compute-0 podman[219614]: 2025-10-06 14:08:34.228169723 +0000 UTC m=+0.080254439 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Oct 06 14:08:35 compute-0 nova_compute[192903]: 2025-10-06 14:08:35.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:38 compute-0 ovn_controller[95205]: 2025-10-06T14:08:38Z|00093|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 06 14:08:38 compute-0 nova_compute[192903]: 2025-10-06 14:08:38.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:39 compute-0 nova_compute[192903]: 2025-10-06 14:08:39.549 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Creating tmpfile /var/lib/nova/instances/tmplahrcht0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:08:39 compute-0 nova_compute[192903]: 2025-10-06 14:08:39.550 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:39 compute-0 nova_compute[192903]: 2025-10-06 14:08:39.567 2 DEBUG nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplahrcht0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:08:39 compute-0 nova_compute[192903]: 2025-10-06 14:08:39.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:39 compute-0 nova_compute[192903]: 2025-10-06 14:08:39.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:08:40 compute-0 nova_compute[192903]: 2025-10-06 14:08:40.089 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:08:40 compute-0 nova_compute[192903]: 2025-10-06 14:08:40.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:41 compute-0 nova_compute[192903]: 2025-10-06 14:08:41.608 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:43 compute-0 nova_compute[192903]: 2025-10-06 14:08:43.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:45 compute-0 nova_compute[192903]: 2025-10-06 14:08:45.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:45 compute-0 nova_compute[192903]: 2025-10-06 14:08:45.706 2 DEBUG nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplahrcht0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b34ea327-38fd-4e84-968c-3a397515944b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:08:46 compute-0 nova_compute[192903]: 2025-10-06 14:08:46.725 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:08:46 compute-0 nova_compute[192903]: 2025-10-06 14:08:46.725 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:08:46 compute-0 nova_compute[192903]: 2025-10-06 14:08:46.726 2 DEBUG nova.network.neutron [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:08:47 compute-0 nova_compute[192903]: 2025-10-06 14:08:47.232 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:47 compute-0 nova_compute[192903]: 2025-10-06 14:08:47.768 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:47 compute-0 nova_compute[192903]: 2025-10-06 14:08:47.951 2 DEBUG nova.network.neutron [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Updating instance_info_cache with network_info: [{"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.457 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.472 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplahrcht0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b34ea327-38fd-4e84-968c-3a397515944b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.473 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Creating instance directory: /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.474 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Creating disk.info with the contents: {'/var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk': 'qcow2', '/var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.474 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.475 2 DEBUG nova.objects.instance [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid b34ea327-38fd-4e84-968c-3a397515944b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.981 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.988 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:08:48 compute-0 nova_compute[192903]: 2025-10-06 14:08:48.990 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.081 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.082 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.083 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.084 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.091 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.092 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.103 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.104 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.164 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.165 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.204 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.206 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.206 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.261 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.262 2 DEBUG nova.virt.disk.api [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.263 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.317 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.319 2 DEBUG nova.virt.disk.api [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.320 2 DEBUG nova.objects.instance [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid b34ea327-38fd-4e84-968c-3a397515944b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.827 2 DEBUG nova.objects.base [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<b34ea327-38fd-4e84-968c-3a397515944b> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.828 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.864 2 DEBUG oslo_concurrency.processutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b/disk.config 497664" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.865 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.868 2 DEBUG nova.virt.libvirt.vif [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:07:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-216104063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-216104063',id=10,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:07:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5755f5f126624f6b82371d76f860b4cc',ramdisk_id='',reservation_id='r-jx3im0lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-282608784',owner_user_name='tempest-TestExecuteBasicStrategy-282608784-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:07:46Z,user_data=None,user_id='d9a309fbe58c4b158f4fb1f5a9ae1216',uuid=b34ea327-38fd-4e84-968c-3a397515944b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.869 2 DEBUG nova.network.os_vif_util [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.870 2 DEBUG nova.network.os_vif_util [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.871 2 DEBUG os_vif [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.873 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.874 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.875 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c8fd0416-2d83-5080-a998-1e52f3945ad1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39c79775-4f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap39c79775-4f, col_values=(('qos', UUID('cf1bbcd6-973c-4d12-a89c-9d5b98a3fd03')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.885 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap39c79775-4f, col_values=(('external_ids', {'iface-id': '39c79775-4fd4-4b0c-b15a-2a0b8560ac02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:42:52', 'vm-uuid': 'b34ea327-38fd-4e84-968c-3a397515944b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 NetworkManager[52035]: <info>  [1759759729.8879] manager: (tap39c79775-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.897 2 INFO os_vif [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f')
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.898 2 DEBUG nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.898 2 DEBUG nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplahrcht0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b34ea327-38fd-4e84-968c-3a397515944b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:08:49 compute-0 nova_compute[192903]: 2025-10-06 14:08:49.900 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:50 compute-0 podman[219658]: 2025-10-06 14:08:50.200355711 +0000 UTC m=+0.059963254 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:08:50 compute-0 nova_compute[192903]: 2025-10-06 14:08:50.889 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:08:51 compute-0 nova_compute[192903]: 2025-10-06 14:08:51.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:52 compute-0 nova_compute[192903]: 2025-10-06 14:08:52.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:52.851 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:08:52 compute-0 nova_compute[192903]: 2025-10-06 14:08:52.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:52.852 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.089 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:53 compute-0 podman[219683]: 2025-10-06 14:08:53.217579617 +0000 UTC m=+0.081306915 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:08:53 compute-0 podman[219685]: 2025-10-06 14:08:53.231312868 +0000 UTC m=+0.083860274 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:08:53 compute-0 podman[219684]: 2025-10-06 14:08:53.232014525 +0000 UTC m=+0.091939964 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.601 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.601 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.601 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.602 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:08:53 compute-0 nova_compute[192903]: 2025-10-06 14:08:53.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:53.855 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.281 2 DEBUG nova.network.neutron [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Port 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.294 2 DEBUG nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplahrcht0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b34ea327-38fd-4e84-968c-3a397515944b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.641 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.729 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.730 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.793 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.938 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.939 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.955 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.955 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.27617263793945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.955 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:08:54 compute-0 nova_compute[192903]: 2025-10-06 14:08:54.956 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:08:55 compute-0 nova_compute[192903]: 2025-10-06 14:08:55.971 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration for instance b34ea327-38fd-4e84-968c-3a397515944b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:08:56 compute-0 nova_compute[192903]: 2025-10-06 14:08:56.479 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Updating resource usage from migration 1f89b3ca-5f9e-427d-8e77-37105d8494ec
Oct 06 14:08:56 compute-0 nova_compute[192903]: 2025-10-06 14:08:56.480 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Starting to track incoming migration 1f89b3ca-5f9e-427d-8e77-37105d8494ec with flavor 8cb06c85-e9e7-417f-906b-1f7cf29f7de9 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 06 14:08:56 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:08:56 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.019 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance ed5ab92d-5355-4703-8afc-71d5bea99132 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:08:57 compute-0 kernel: tap39c79775-4f: entered promiscuous mode
Oct 06 14:08:57 compute-0 NetworkManager[52035]: <info>  [1759759737.1311] manager: (tap39c79775-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Oct 06 14:08:57 compute-0 ovn_controller[95205]: 2025-10-06T14:08:57Z|00094|binding|INFO|Claiming lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 for this additional chassis.
Oct 06 14:08:57 compute-0 ovn_controller[95205]: 2025-10-06T14:08:57Z|00095|binding|INFO|39c79775-4fd4-4b0c-b15a-2a0b8560ac02: Claiming fa:16:3e:aa:42:52 10.100.0.5
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.142 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:42:52 10.100.0.5'], port_security=['fa:16:3e:aa:42:52 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b34ea327-38fd-4e84-968c-3a397515944b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '10', 'neutron:security_group_ids': '72b17cd8-5f00-46da-a023-10d2a647ccff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb873b11-6bc3-4cba-8b83-39f2042a0d3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=39c79775-4fd4-4b0c-b15a-2a0b8560ac02) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.144 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 in datapath f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 unbound from our chassis
Oct 06 14:08:57 compute-0 ovn_controller[95205]: 2025-10-06T14:08:57Z|00096|binding|INFO|Setting lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 ovn-installed in OVS
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.145 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.172 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4e736f69-469d-482d-bf08-f8f3d8240aa4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 systemd-machined[152985]: New machine qemu-8-instance-0000000a.
Oct 06 14:08:57 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Oct 06 14:08:57 compute-0 systemd-udevd[219791]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.226 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[3e2731a9-6d6d-469e-8a0a-2acc854819e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.229 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7f65e3a5-a70c-4269-9bd7-55892e31636e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 NetworkManager[52035]: <info>  [1759759737.2370] device (tap39c79775-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:08:57 compute-0 NetworkManager[52035]: <info>  [1759759737.2396] device (tap39c79775-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.271 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ca04d9-d39a-48e4-8664-8371e1271f68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.302 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8ffe47c4-0f79-4227-a91c-fa36435718b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2d0b7b9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:25:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414947, 'reachable_time': 29322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219801, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.325 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb2d53d-11d4-40ff-8ea2-9ad586b37102]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf2d0b7b9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414962, 'tstamp': 414962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219802, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf2d0b7b9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414966, 'tstamp': 414966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219802, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.328 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2d0b7b9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.332 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2d0b7b9-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.333 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.333 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2d0b7b9-f0, col_values=(('external_ids', {'iface-id': '1c2f1045-4132-4ec7-9b93-34567076228a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.333 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:08:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:08:57.336 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f00e1d65-0703-4b13-b890-68ee9ea36f1d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.528 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance b34ea327-38fd-4e84-968c-3a397515944b has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.529 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.529 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:08:54 up  1:09,  0 user,  load average: 0.47, 0.48, 0.45\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5755f5f126624f6b82371d76f860b4cc': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:08:57 compute-0 nova_compute[192903]: 2025-10-06 14:08:57.608 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.114 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.626 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.627 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.671s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.627 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.627 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:08:58 compute-0 nova_compute[192903]: 2025-10-06 14:08:58.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:08:59 compute-0 nova_compute[192903]: 2025-10-06 14:08:59.134 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:08:59 compute-0 podman[203308]: time="2025-10-06T14:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:08:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:08:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 06 14:08:59 compute-0 nova_compute[192903]: 2025-10-06 14:08:59.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:00 compute-0 nova_compute[192903]: 2025-10-06 14:09:00.132 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:00 compute-0 nova_compute[192903]: 2025-10-06 14:09:00.132 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:00 compute-0 nova_compute[192903]: 2025-10-06 14:09:00.133 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:00 compute-0 nova_compute[192903]: 2025-10-06 14:09:00.133 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:00 compute-0 nova_compute[192903]: 2025-10-06 14:09:00.133 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:09:01 compute-0 ovn_controller[95205]: 2025-10-06T14:09:01Z|00097|binding|INFO|Claiming lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 for this chassis.
Oct 06 14:09:01 compute-0 ovn_controller[95205]: 2025-10-06T14:09:01Z|00098|binding|INFO|39c79775-4fd4-4b0c-b15a-2a0b8560ac02: Claiming fa:16:3e:aa:42:52 10.100.0.5
Oct 06 14:09:01 compute-0 ovn_controller[95205]: 2025-10-06T14:09:01Z|00099|binding|INFO|Setting lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 up in Southbound
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: ERROR   14:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: ERROR   14:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: ERROR   14:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: ERROR   14:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: ERROR   14:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:09:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:09:02 compute-0 nova_compute[192903]: 2025-10-06 14:09:02.198 2 INFO nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Post operation of migration started
Oct 06 14:09:02 compute-0 nova_compute[192903]: 2025-10-06 14:09:02.199 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:02 compute-0 nova_compute[192903]: 2025-10-06 14:09:02.871 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:02 compute-0 nova_compute[192903]: 2025-10-06 14:09:02.872 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:03 compute-0 nova_compute[192903]: 2025-10-06 14:09:03.001 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:09:03 compute-0 nova_compute[192903]: 2025-10-06 14:09:03.002 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:09:03 compute-0 nova_compute[192903]: 2025-10-06 14:09:03.003 2 DEBUG nova.network.neutron [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:09:03 compute-0 podman[219826]: 2025-10-06 14:09:03.24110551 +0000 UTC m=+0.095551568 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:09:03 compute-0 nova_compute[192903]: 2025-10-06 14:09:03.512 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:03 compute-0 nova_compute[192903]: 2025-10-06 14:09:03.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:04 compute-0 nova_compute[192903]: 2025-10-06 14:09:04.600 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:04 compute-0 nova_compute[192903]: 2025-10-06 14:09:04.803 2 DEBUG nova.network.neutron [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Updating instance_info_cache with network_info: [{"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:09:04 compute-0 nova_compute[192903]: 2025-10-06 14:09:04.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:05 compute-0 podman[219847]: 2025-10-06 14:09:05.236905945 +0000 UTC m=+0.088558455 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Oct 06 14:09:05 compute-0 nova_compute[192903]: 2025-10-06 14:09:05.312 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-b34ea327-38fd-4e84-968c-3a397515944b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:09:07 compute-0 nova_compute[192903]: 2025-10-06 14:09:07.063 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:07 compute-0 nova_compute[192903]: 2025-10-06 14:09:07.064 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:07 compute-0 nova_compute[192903]: 2025-10-06 14:09:07.064 2 DEBUG oslo_concurrency.lockutils [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:07 compute-0 nova_compute[192903]: 2025-10-06 14:09:07.070 2 INFO nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:09:07 compute-0 virtqemud[192802]: Domain id=8 name='instance-0000000a' uuid=b34ea327-38fd-4e84-968c-3a397515944b is tainted: custom-monitor
Oct 06 14:09:08 compute-0 nova_compute[192903]: 2025-10-06 14:09:08.080 2 INFO nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:09:08 compute-0 nova_compute[192903]: 2025-10-06 14:09:08.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:09 compute-0 nova_compute[192903]: 2025-10-06 14:09:09.088 2 INFO nova.virt.libvirt.driver [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:09:09 compute-0 nova_compute[192903]: 2025-10-06 14:09:09.093 2 DEBUG nova.compute.manager [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:09:09 compute-0 nova_compute[192903]: 2025-10-06 14:09:09.603 2 DEBUG nova.objects.instance [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:09:09 compute-0 nova_compute[192903]: 2025-10-06 14:09:09.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:10 compute-0 nova_compute[192903]: 2025-10-06 14:09:10.624 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:10 compute-0 nova_compute[192903]: 2025-10-06 14:09:10.871 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:10 compute-0 nova_compute[192903]: 2025-10-06 14:09:10.872 2 WARNING neutronclient.v2_0.client [None req-a979cf85-34d2-4462-ac82-36a268a591ad f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:11.364 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:11.365 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:11.366 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:13 compute-0 nova_compute[192903]: 2025-10-06 14:09:13.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:14 compute-0 nova_compute[192903]: 2025-10-06 14:09:14.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.218 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.218 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.219 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.219 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.220 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.237 2 INFO nova.compute.manager [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Terminating instance
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.754 2 DEBUG nova.compute.manager [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:09:15 compute-0 kernel: tapb40989b0-f6 (unregistering): left promiscuous mode
Oct 06 14:09:15 compute-0 NetworkManager[52035]: <info>  [1759759755.7798] device (tapb40989b0-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 ovn_controller[95205]: 2025-10-06T14:09:15Z|00100|binding|INFO|Releasing lport b40989b0-f6eb-4f13-8c80-7c66fdbc387a from this chassis (sb_readonly=0)
Oct 06 14:09:15 compute-0 ovn_controller[95205]: 2025-10-06T14:09:15Z|00101|binding|INFO|Setting lport b40989b0-f6eb-4f13-8c80-7c66fdbc387a down in Southbound
Oct 06 14:09:15 compute-0 ovn_controller[95205]: 2025-10-06T14:09:15Z|00102|binding|INFO|Removing iface tapb40989b0-f6 ovn-installed in OVS
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.820 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:6f:28 10.100.0.10'], port_security=['fa:16:3e:a2:6f:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed5ab92d-5355-4703-8afc-71d5bea99132', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '72b17cd8-5f00-46da-a023-10d2a647ccff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb873b11-6bc3-4cba-8b83-39f2042a0d3f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=b40989b0-f6eb-4f13-8c80-7c66fdbc387a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.820 104072 INFO neutron.agent.ovn.metadata.agent [-] Port b40989b0-f6eb-4f13-8c80-7c66fdbc387a in datapath f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 unbound from our chassis
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.821 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.836 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dedd2cda-63fa-4ec6-baad-bf8e5e1916a5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 06 14:09:15 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 16.077s CPU time.
Oct 06 14:09:15 compute-0 systemd-machined[152985]: Machine qemu-7-instance-0000000b terminated.
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.865 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c24da247-92e2-4d77-95b0-400685680cf7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.868 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f76b2d-8273-4372-83ca-a81456d69588]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.894 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f48f818b-46c1-463b-b7e0-dbc23e3c303e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.912 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[09350faa-e833-42c9-ad90-d650e21845a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2d0b7b9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:25:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414947, 'reachable_time': 29322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219885, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.928 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[12c4c0a9-4789-4066-b77f-a38670a2d44c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf2d0b7b9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414962, 'tstamp': 414962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219886, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf2d0b7b9-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414966, 'tstamp': 414966}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219886, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.930 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2d0b7b9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 nova_compute[192903]: 2025-10-06 14:09:15.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.937 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2d0b7b9-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.937 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.937 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2d0b7b9-f0, col_values=(('external_ids', {'iface-id': '1c2f1045-4132-4ec7-9b93-34567076228a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.938 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:09:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:15.939 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[29d423d5-d8cb-4797-b858-23199d607aca]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.026 2 INFO nova.virt.libvirt.driver [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Instance destroyed successfully.
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.027 2 DEBUG nova.objects.instance [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lazy-loading 'resources' on Instance uuid ed5ab92d-5355-4703-8afc-71d5bea99132 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.383 2 DEBUG nova.compute.manager [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.384 2 DEBUG oslo_concurrency.lockutils [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.384 2 DEBUG oslo_concurrency.lockutils [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.384 2 DEBUG oslo_concurrency.lockutils [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.384 2 DEBUG nova.compute.manager [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] No waiting events found dispatching network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.385 2 DEBUG nova.compute.manager [req-b41ca436-623a-4a88-b3e2-7a2f92872764 req-636a175a-0eac-4431-a8ad-0a828d19894f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.535 2 DEBUG nova.virt.libvirt.vif [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:07:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1152128813',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1152128813',id=11,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:08:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5755f5f126624f6b82371d76f860b4cc',ramdisk_id='',reservation_id='r-f8kojz0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_m
in_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-282608784',owner_user_name='tempest-TestExecuteBasicStrategy-282608784-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:08:10Z,user_data=None,user_id='d9a309fbe58c4b158f4fb1f5a9ae1216',uuid=ed5ab92d-5355-4703-8afc-71d5bea99132,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.536 2 DEBUG nova.network.os_vif_util [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converting VIF {"id": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "address": "fa:16:3e:a2:6f:28", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40989b0-f6", "ovs_interfaceid": "b40989b0-f6eb-4f13-8c80-7c66fdbc387a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.537 2 DEBUG nova.network.os_vif_util [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.537 2 DEBUG os_vif [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb40989b0-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ff4be241-bdcd-4043-842b-0c381e30ced0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.550 2 INFO os_vif [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:6f:28,bridge_name='br-int',has_traffic_filtering=True,id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40989b0-f6')
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.551 2 INFO nova.virt.libvirt.driver [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Deleting instance files /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132_del
Oct 06 14:09:16 compute-0 nova_compute[192903]: 2025-10-06 14:09:16.551 2 INFO nova.virt.libvirt.driver [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Deletion of /var/lib/nova/instances/ed5ab92d-5355-4703-8afc-71d5bea99132_del complete
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.064 2 INFO nova.compute.manager [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.065 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.066 2 DEBUG nova.compute.manager [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.066 2 DEBUG nova.network.neutron [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.067 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.481 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.947 2 DEBUG nova.compute.manager [req-66c8c8a9-486e-426c-b469-a908b2bf1680 req-23261528-9dde-46f5-9c6b-1aeaf44ed4e0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-deleted-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.948 2 INFO nova.compute.manager [req-66c8c8a9-486e-426c-b469-a908b2bf1680 req-23261528-9dde-46f5-9c6b-1aeaf44ed4e0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Neutron deleted interface b40989b0-f6eb-4f13-8c80-7c66fdbc387a; detaching it from the instance and deleting it from the info cache
Oct 06 14:09:17 compute-0 nova_compute[192903]: 2025-10-06 14:09:17.948 2 DEBUG nova.network.neutron [req-66c8c8a9-486e-426c-b469-a908b2bf1680 req-23261528-9dde-46f5-9c6b-1aeaf44ed4e0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.382 2 DEBUG nova.network.neutron [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.456 2 DEBUG nova.compute.manager [req-66c8c8a9-486e-426c-b469-a908b2bf1680 req-23261528-9dde-46f5-9c6b-1aeaf44ed4e0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Detach interface failed, port_id=b40989b0-f6eb-4f13-8c80-7c66fdbc387a, reason: Instance ed5ab92d-5355-4703-8afc-71d5bea99132 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.460 2 DEBUG nova.compute.manager [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.460 2 DEBUG oslo_concurrency.lockutils [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.460 2 DEBUG oslo_concurrency.lockutils [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.460 2 DEBUG oslo_concurrency.lockutils [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.461 2 DEBUG nova.compute.manager [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] No waiting events found dispatching network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.461 2 DEBUG nova.compute.manager [req-0da05dae-014f-423a-9aae-7ab6839f0e83 req-ff9ecc7e-7f42-4b18-84cf-9d7ffebbf5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Received event network-vif-unplugged-b40989b0-f6eb-4f13-8c80-7c66fdbc387a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:18 compute-0 nova_compute[192903]: 2025-10-06 14:09:18.893 2 INFO nova.compute.manager [-] [instance: ed5ab92d-5355-4703-8afc-71d5bea99132] Took 1.83 seconds to deallocate network for instance.
Oct 06 14:09:19 compute-0 nova_compute[192903]: 2025-10-06 14:09:19.413 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:19 compute-0 nova_compute[192903]: 2025-10-06 14:09:19.414 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:19 compute-0 nova_compute[192903]: 2025-10-06 14:09:19.484 2 DEBUG nova.compute.provider_tree [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:09:19 compute-0 nova_compute[192903]: 2025-10-06 14:09:19.994 2 DEBUG nova.scheduler.client.report [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:09:20 compute-0 nova_compute[192903]: 2025-10-06 14:09:20.504 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:20 compute-0 nova_compute[192903]: 2025-10-06 14:09:20.530 2 INFO nova.scheduler.client.report [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Deleted allocations for instance ed5ab92d-5355-4703-8afc-71d5bea99132
Oct 06 14:09:21 compute-0 podman[219905]: 2025-10-06 14:09:21.224778902 +0000 UTC m=+0.079220005 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:09:21 compute-0 nova_compute[192903]: 2025-10-06 14:09:21.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:21 compute-0 nova_compute[192903]: 2025-10-06 14:09:21.559 2 DEBUG oslo_concurrency.lockutils [None req-7d21ce21-48a8-4d65-a9a3-89923a01112e d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "ed5ab92d-5355-4703-8afc-71d5bea99132" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.341s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.578 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "b34ea327-38fd-4e84-968c-3a397515944b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.579 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.579 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "b34ea327-38fd-4e84-968c-3a397515944b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.579 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.580 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:22 compute-0 nova_compute[192903]: 2025-10-06 14:09:22.590 2 INFO nova.compute.manager [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Terminating instance
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.108 2 DEBUG nova.compute.manager [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:09:23 compute-0 kernel: tap39c79775-4f (unregistering): left promiscuous mode
Oct 06 14:09:23 compute-0 NetworkManager[52035]: <info>  [1759759763.1520] device (tap39c79775-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:09:23 compute-0 ovn_controller[95205]: 2025-10-06T14:09:23Z|00103|binding|INFO|Releasing lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 from this chassis (sb_readonly=0)
Oct 06 14:09:23 compute-0 ovn_controller[95205]: 2025-10-06T14:09:23Z|00104|binding|INFO|Setting lport 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 down in Southbound
Oct 06 14:09:23 compute-0 ovn_controller[95205]: 2025-10-06T14:09:23Z|00105|binding|INFO|Removing iface tap39c79775-4f ovn-installed in OVS
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.227 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:42:52 10.100.0.5'], port_security=['fa:16:3e:aa:42:52 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b34ea327-38fd-4e84-968c-3a397515944b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5755f5f126624f6b82371d76f860b4cc', 'neutron:revision_number': '16', 'neutron:security_group_ids': '72b17cd8-5f00-46da-a023-10d2a647ccff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb873b11-6bc3-4cba-8b83-39f2042a0d3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=39c79775-4fd4-4b0c-b15a-2a0b8560ac02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.229 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 39c79775-4fd4-4b0c-b15a-2a0b8560ac02 in datapath f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 unbound from our chassis
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.231 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.232 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c195b2-fbab-48c6-ab64-20ef18b4004d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.233 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 namespace which is not needed anymore
Oct 06 14:09:23 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct 06 14:09:23 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 2.675s CPU time.
Oct 06 14:09:23 compute-0 systemd-machined[152985]: Machine qemu-8-instance-0000000a terminated.
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 podman[219938]: 2025-10-06 14:09:23.366493844 +0000 UTC m=+0.082848871 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 06 14:09:23 compute-0 podman[219937]: 2025-10-06 14:09:23.375241998 +0000 UTC m=+0.092019485 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.383 2 INFO nova.virt.libvirt.driver [-] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Instance destroyed successfully.
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.384 2 DEBUG nova.objects.instance [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lazy-loading 'resources' on Instance uuid b34ea327-38fd-4e84-968c-3a397515944b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:09:23 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [NOTICE]   (219466) : haproxy version is 3.0.5-8e879a5
Oct 06 14:09:23 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [NOTICE]   (219466) : path to executable is /usr/sbin/haproxy
Oct 06 14:09:23 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [WARNING]  (219466) : Exiting Master process...
Oct 06 14:09:23 compute-0 podman[219984]: 2025-10-06 14:09:23.384750011 +0000 UTC m=+0.045718801 container kill 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:09:23 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [ALERT]    (219466) : Current worker (219468) exited with code 143 (Terminated)
Oct 06 14:09:23 compute-0 neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774[219462]: [WARNING]  (219466) : All workers exited. Exiting... (0)
Oct 06 14:09:23 compute-0 systemd[1]: libpod-789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129.scope: Deactivated successfully.
Oct 06 14:09:23 compute-0 conmon[219462]: conmon 789a74850df61b32acc2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129.scope/container/memory.events
Oct 06 14:09:23 compute-0 podman[219932]: 2025-10-06 14:09:23.401125474 +0000 UTC m=+0.130063556 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Oct 06 14:09:23 compute-0 podman[220041]: 2025-10-06 14:09:23.434609538 +0000 UTC m=+0.027823862 container died 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 14:09:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129-userdata-shm.mount: Deactivated successfully.
Oct 06 14:09:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4f407bcb1e37e95fd531cf1a9fa6d24938c97d90e4391ae28a875cf035a5bed-merged.mount: Deactivated successfully.
Oct 06 14:09:23 compute-0 podman[220041]: 2025-10-06 14:09:23.471258296 +0000 UTC m=+0.064472600 container cleanup 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:09:23 compute-0 systemd[1]: libpod-conmon-789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129.scope: Deactivated successfully.
Oct 06 14:09:23 compute-0 podman[220043]: 2025-10-06 14:09:23.490397234 +0000 UTC m=+0.071765291 container remove 789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.497 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[de60208d-a656-4c66-81a9-ec2df06f9515]: (4, ("Mon Oct  6 02:09:23 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 (789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129)\n789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129\nMon Oct  6 02:09:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 (789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129)\n789a74850df61b32acc2409c59a084191b9967bdfa090f9fea7d27991e4f2129\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.498 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9a42dbd5-e006-4ce1-a2e4-ffe852025295]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.499 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.500 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6be5eae9-dd6c-4adc-98d3-bec4318b5624]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.501 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2d0b7b9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 kernel: tapf2d0b7b9-f0: left promiscuous mode
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.528 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf8974e-f2a5-41d2-a062-d8fb646d2423]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.549 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4d707d-71d5-4c1d-9b1b-9fe13d782d31]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.550 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[daa66bbc-2df3-48f6-8013-2e37c724e951]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.571 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff74ae1-b3bf-4d83-9d2d-a2e0536360a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414938, 'reachable_time': 29839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220075, 'error': None, 'target': 'ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.574 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:09:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:23.574 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[08df5d29-63b0-47b0-933b-c5abf24fffc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:23 compute-0 systemd[1]: run-netns-ovnmeta\x2df2d0b7b9\x2df8b8\x2d4a0b\x2d8ae3\x2d1ea1c0092774.mount: Deactivated successfully.
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.890 2 DEBUG nova.virt.libvirt.vif [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:07:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-216104063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-216104063',id=10,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:07:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5755f5f126624f6b82371d76f860b4cc',ramdisk_id='',reservation_id='r-jx3im0lq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-282608784',owner_user_name='tempest-TestExecuteBasicStrategy-282608784-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:09:10Z,user_data=None,user_id='d9a309fbe58c4b158f4fb1f5a9ae1216',uuid=b34ea327-38fd-4e84-968c-3a397515944b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.891 2 DEBUG nova.network.os_vif_util [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converting VIF {"id": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "address": "fa:16:3e:aa:42:52", "network": {"id": "f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-898085814-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37fbf627b5a647e5a616e5d55c765875", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39c79775-4f", "ovs_interfaceid": "39c79775-4fd4-4b0c-b15a-2a0b8560ac02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.892 2 DEBUG nova.network.os_vif_util [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.893 2 DEBUG os_vif [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.895 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39c79775-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cf1bbcd6-973c-4d12-a89c-9d5b98a3fd03) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.910 2 INFO os_vif [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:42:52,bridge_name='br-int',has_traffic_filtering=True,id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02,network=Network(f2d0b7b9-f8b8-4a0b-8ae3-1ea1c0092774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39c79775-4f')
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.911 2 INFO nova.virt.libvirt.driver [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Deleting instance files /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b_del
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.912 2 INFO nova.virt.libvirt.driver [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Deletion of /var/lib/nova/instances/b34ea327-38fd-4e84-968c-3a397515944b_del complete
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.953 2 DEBUG nova.compute.manager [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Received event network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.954 2 DEBUG oslo_concurrency.lockutils [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "b34ea327-38fd-4e84-968c-3a397515944b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.954 2 DEBUG oslo_concurrency.lockutils [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.955 2 DEBUG oslo_concurrency.lockutils [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.955 2 DEBUG nova.compute.manager [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] No waiting events found dispatching network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:09:23 compute-0 nova_compute[192903]: 2025-10-06 14:09:23.955 2 DEBUG nova.compute.manager [req-9d2e467f-5301-48f4-a99b-565279280b8f req-78998f7b-df84-4a69-87a2-53d96f600517 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Received event network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.428 2 INFO nova.compute.manager [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.428 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.429 2 DEBUG nova.compute.manager [-] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.429 2 DEBUG nova.network.neutron [-] [instance: b34ea327-38fd-4e84-968c-3a397515944b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.430 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:24 compute-0 nova_compute[192903]: 2025-10-06 14:09:24.870 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:09:25 compute-0 nova_compute[192903]: 2025-10-06 14:09:25.128 2 DEBUG nova.compute.manager [req-8c51c7c3-5242-4352-95db-87313c0e97f3 req-5b81c2b2-a4da-4f4a-a846-2d1afa6807f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Received event network-vif-deleted-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:25 compute-0 nova_compute[192903]: 2025-10-06 14:09:25.128 2 INFO nova.compute.manager [req-8c51c7c3-5242-4352-95db-87313c0e97f3 req-5b81c2b2-a4da-4f4a-a846-2d1afa6807f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Neutron deleted interface 39c79775-4fd4-4b0c-b15a-2a0b8560ac02; detaching it from the instance and deleting it from the info cache
Oct 06 14:09:25 compute-0 nova_compute[192903]: 2025-10-06 14:09:25.129 2 DEBUG nova.network.neutron [req-8c51c7c3-5242-4352-95db-87313c0e97f3 req-5b81c2b2-a4da-4f4a-a846-2d1afa6807f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:09:25 compute-0 nova_compute[192903]: 2025-10-06 14:09:25.588 2 DEBUG nova.network.neutron [-] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:09:25 compute-0 nova_compute[192903]: 2025-10-06 14:09:25.637 2 DEBUG nova.compute.manager [req-8c51c7c3-5242-4352-95db-87313c0e97f3 req-5b81c2b2-a4da-4f4a-a846-2d1afa6807f7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Detach interface failed, port_id=39c79775-4fd4-4b0c-b15a-2a0b8560ac02, reason: Instance b34ea327-38fd-4e84-968c-3a397515944b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.009 2 DEBUG nova.compute.manager [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Received event network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.010 2 DEBUG oslo_concurrency.lockutils [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "b34ea327-38fd-4e84-968c-3a397515944b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.010 2 DEBUG oslo_concurrency.lockutils [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.010 2 DEBUG oslo_concurrency.lockutils [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.011 2 DEBUG nova.compute.manager [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] No waiting events found dispatching network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.011 2 DEBUG nova.compute.manager [req-6187c9ce-4b5b-4f5e-8f52-0cd2b5cbab49 req-84e15d60-5143-4249-869c-eac1ef957861 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Received event network-vif-unplugged-39c79775-4fd4-4b0c-b15a-2a0b8560ac02 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.094 2 INFO nova.compute.manager [-] [instance: b34ea327-38fd-4e84-968c-3a397515944b] Took 1.66 seconds to deallocate network for instance.
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.610 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.611 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.616 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:26 compute-0 nova_compute[192903]: 2025-10-06 14:09:26.645 2 INFO nova.scheduler.client.report [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Deleted allocations for instance b34ea327-38fd-4e84-968c-3a397515944b
Oct 06 14:09:27 compute-0 nova_compute[192903]: 2025-10-06 14:09:27.665 2 DEBUG oslo_concurrency.lockutils [None req-f1140603-f3ea-4228-a94e-86973e66f40a d9a309fbe58c4b158f4fb1f5a9ae1216 5755f5f126624f6b82371d76f860b4cc - - default default] Lock "b34ea327-38fd-4e84-968c-3a397515944b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:28 compute-0 nova_compute[192903]: 2025-10-06 14:09:28.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:28 compute-0 nova_compute[192903]: 2025-10-06 14:09:28.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:29 compute-0 podman[203308]: time="2025-10-06T14:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:09:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:09:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: ERROR   14:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: ERROR   14:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: ERROR   14:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: ERROR   14:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: ERROR   14:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:09:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:09:31 compute-0 nova_compute[192903]: 2025-10-06 14:09:31.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:33 compute-0 nova_compute[192903]: 2025-10-06 14:09:33.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:33 compute-0 podman[220077]: 2025-10-06 14:09:33.899065152 +0000 UTC m=+0.071521055 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:09:33 compute-0 nova_compute[192903]: 2025-10-06 14:09:33.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:36 compute-0 podman[220097]: 2025-10-06 14:09:36.225859136 +0000 UTC m=+0.081710764 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Oct 06 14:09:36 compute-0 nova_compute[192903]: 2025-10-06 14:09:36.634 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:38 compute-0 nova_compute[192903]: 2025-10-06 14:09:38.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:38 compute-0 nova_compute[192903]: 2025-10-06 14:09:38.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:43 compute-0 nova_compute[192903]: 2025-10-06 14:09:43.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:43 compute-0 nova_compute[192903]: 2025-10-06 14:09:43.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:44.575 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:79:a7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-97a18ed5-1b5f-463a-8d47-91fd478eb09a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97a18ed5-1b5f-463a-8d47-91fd478eb09a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c582640d4c940c0bde1e0e5b497e678', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3b56e6c-81d0-45e1-ad91-d45e9af26b75, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=33b47598-864b-4f2f-ae5f-b3aa49d202df) old=Port_Binding(mac=['fa:16:3e:df:79:a7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-97a18ed5-1b5f-463a-8d47-91fd478eb09a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97a18ed5-1b5f-463a-8d47-91fd478eb09a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c582640d4c940c0bde1e0e5b497e678', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:09:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:44.576 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 33b47598-864b-4f2f-ae5f-b3aa49d202df in datapath 97a18ed5-1b5f-463a-8d47-91fd478eb09a updated
Oct 06 14:09:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:44.577 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 97a18ed5-1b5f-463a-8d47-91fd478eb09a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:09:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:44.578 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0f68f396-9299-40b1-86fd-9a44b039f0d4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:48 compute-0 nova_compute[192903]: 2025-10-06 14:09:48.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:48 compute-0 nova_compute[192903]: 2025-10-06 14:09:48.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:50 compute-0 nova_compute[192903]: 2025-10-06 14:09:50.095 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:50 compute-0 nova_compute[192903]: 2025-10-06 14:09:50.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:51 compute-0 nova_compute[192903]: 2025-10-06 14:09:51.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:52 compute-0 podman[220118]: 2025-10-06 14:09:52.19826836 +0000 UTC m=+0.061462400 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:09:52 compute-0 nova_compute[192903]: 2025-10-06 14:09:52.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:52 compute-0 nova_compute[192903]: 2025-10-06 14:09:52.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:52.943 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:09:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:52.945 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.102 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.102 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.102 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.285 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.286 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.309 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.309 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.3059310913086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.310 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.310 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:53 compute-0 nova_compute[192903]: 2025-10-06 14:09:53.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:54 compute-0 podman[220146]: 2025-10-06 14:09:54.209328402 +0000 UTC m=+0.066576690 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Oct 06 14:09:54 compute-0 podman[220147]: 2025-10-06 14:09:54.219029749 +0000 UTC m=+0.074366572 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Oct 06 14:09:54 compute-0 podman[220145]: 2025-10-06 14:09:54.277838406 +0000 UTC m=+0.130939927 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 06 14:09:54 compute-0 nova_compute[192903]: 2025-10-06 14:09:54.571 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:09:54 compute-0 nova_compute[192903]: 2025-10-06 14:09:54.571 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:09:53 up  1:10,  0 user,  load average: 0.22, 0.41, 0.43\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:09:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:54.587 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:79:a5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69bb104c97ff4b3a8d2316c7a04ba38d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b79ec99a-1e42-478f-a88f-bd1beccec4ee, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=121ee12f-5d3a-4678-8f2d-a6639f6a63a3) old=Port_Binding(mac=['fa:16:3e:f1:79:a5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '69bb104c97ff4b3a8d2316c7a04ba38d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:09:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:54.588 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 121ee12f-5d3a-4678-8f2d-a6639f6a63a3 in datapath 8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4 updated
Oct 06 14:09:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:54.589 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cda85d1-107e-4e78-8a2a-03fbeb4bf6c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:09:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:54.590 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9c5dd4-5bb4-452a-8598-8d85db9d8706]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:09:54 compute-0 nova_compute[192903]: 2025-10-06 14:09:54.649 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:09:55 compute-0 nova_compute[192903]: 2025-10-06 14:09:55.155 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:09:55 compute-0 nova_compute[192903]: 2025-10-06 14:09:55.664 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:09:55 compute-0 nova_compute[192903]: 2025-10-06 14:09:55.664 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.354s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:09:56 compute-0 nova_compute[192903]: 2025-10-06 14:09:56.664 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:56 compute-0 nova_compute[192903]: 2025-10-06 14:09:56.665 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:56 compute-0 nova_compute[192903]: 2025-10-06 14:09:56.665 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:09:57 compute-0 nova_compute[192903]: 2025-10-06 14:09:57.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:58 compute-0 nova_compute[192903]: 2025-10-06 14:09:58.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:09:58 compute-0 nova_compute[192903]: 2025-10-06 14:09:58.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:58 compute-0 nova_compute[192903]: 2025-10-06 14:09:58.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:09:59 compute-0 podman[203308]: time="2025-10-06T14:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:09:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:09:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 06 14:09:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:09:59.946 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: ERROR   14:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: ERROR   14:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: ERROR   14:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: ERROR   14:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: ERROR   14:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:10:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:10:03 compute-0 nova_compute[192903]: 2025-10-06 14:10:03.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:03 compute-0 nova_compute[192903]: 2025-10-06 14:10:03.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:04 compute-0 podman[220205]: 2025-10-06 14:10:04.217340321 +0000 UTC m=+0.082305408 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:10:04 compute-0 ovn_controller[95205]: 2025-10-06T14:10:04Z|00106|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 06 14:10:07 compute-0 podman[220225]: 2025-10-06 14:10:07.235274714 +0000 UTC m=+0.097127185 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7)
Oct 06 14:10:08 compute-0 nova_compute[192903]: 2025-10-06 14:10:08.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:08 compute-0 nova_compute[192903]: 2025-10-06 14:10:08.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:11.367 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:10:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:11.367 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:10:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:11.368 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:10:13 compute-0 nova_compute[192903]: 2025-10-06 14:10:13.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:13 compute-0 nova_compute[192903]: 2025-10-06 14:10:13.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:18 compute-0 nova_compute[192903]: 2025-10-06 14:10:18.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:18 compute-0 nova_compute[192903]: 2025-10-06 14:10:18.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:23 compute-0 podman[220247]: 2025-10-06 14:10:23.198359571 +0000 UTC m=+0.064711476 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:10:23 compute-0 nova_compute[192903]: 2025-10-06 14:10:23.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:23 compute-0 nova_compute[192903]: 2025-10-06 14:10:23.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:25 compute-0 podman[220272]: 2025-10-06 14:10:25.206461503 +0000 UTC m=+0.064839598 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:10:25 compute-0 podman[220273]: 2025-10-06 14:10:25.214286987 +0000 UTC m=+0.067665725 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 06 14:10:25 compute-0 podman[220271]: 2025-10-06 14:10:25.260354235 +0000 UTC m=+0.125603931 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 06 14:10:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:26.575 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:70:95 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '040822eef8234394a03ec96f615f5048', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=01e7ff9b-7072-42b9-b412-c40a88736ea9) old=Port_Binding(mac=['fa:16:3e:e0:70:95'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '040822eef8234394a03ec96f615f5048', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:10:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:26.576 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 01e7ff9b-7072-42b9-b412-c40a88736ea9 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 updated
Oct 06 14:10:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:26.577 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:10:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:26.579 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[39348838-c6e1-4855-bac1-a83816c9d0c1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:10:28 compute-0 nova_compute[192903]: 2025-10-06 14:10:28.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:28 compute-0 nova_compute[192903]: 2025-10-06 14:10:28.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:29 compute-0 podman[203308]: time="2025-10-06T14:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:10:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:10:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: ERROR   14:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: ERROR   14:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: ERROR   14:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: ERROR   14:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: ERROR   14:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:10:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:10:33 compute-0 nova_compute[192903]: 2025-10-06 14:10:33.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:33 compute-0 nova_compute[192903]: 2025-10-06 14:10:33.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:35 compute-0 podman[220335]: 2025-10-06 14:10:35.198912629 +0000 UTC m=+0.068456844 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 14:10:35 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:35.199 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:d4:04 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f3986533-3317-4e00-b6e0-91a07b54ab89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3986533-3317-4e00-b6e0-91a07b54ab89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a06830f0-e7e5-4308-a654-035d4c2a6d10, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c8bc2b2c-ae88-492c-a9a6-74faddba3d9f) old=Port_Binding(mac=['fa:16:3e:d2:d4:04'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f3986533-3317-4e00-b6e0-91a07b54ab89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3986533-3317-4e00-b6e0-91a07b54ab89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:10:35 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:35.199 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c8bc2b2c-ae88-492c-a9a6-74faddba3d9f in datapath f3986533-3317-4e00-b6e0-91a07b54ab89 updated
Oct 06 14:10:35 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:35.201 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3986533-3317-4e00-b6e0-91a07b54ab89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:10:35 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:35.202 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dffcdb10-b358-46e1-8932-8d9108807a25]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:10:38 compute-0 podman[220356]: 2025-10-06 14:10:38.242279559 +0000 UTC m=+0.099196360 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Oct 06 14:10:38 compute-0 nova_compute[192903]: 2025-10-06 14:10:38.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:38 compute-0 nova_compute[192903]: 2025-10-06 14:10:38.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:43 compute-0 nova_compute[192903]: 2025-10-06 14:10:43.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:43 compute-0 nova_compute[192903]: 2025-10-06 14:10:43.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:48 compute-0 nova_compute[192903]: 2025-10-06 14:10:48.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:48 compute-0 nova_compute[192903]: 2025-10-06 14:10:48.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:49 compute-0 nova_compute[192903]: 2025-10-06 14:10:49.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:51 compute-0 nova_compute[192903]: 2025-10-06 14:10:51.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:52 compute-0 nova_compute[192903]: 2025-10-06 14:10:52.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.087 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.087 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.596 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.597 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.597 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.597 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.796 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.797 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.838 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.839 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.3061294555664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.839 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.839 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:53 compute-0 nova_compute[192903]: 2025-10-06 14:10:53.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:54 compute-0 podman[220379]: 2025-10-06 14:10:54.206761821 +0000 UTC m=+0.062756687 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:10:54 compute-0 nova_compute[192903]: 2025-10-06 14:10:54.895 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:10:54 compute-0 nova_compute[192903]: 2025-10-06 14:10:54.895 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:10:53 up  1:11,  0 user,  load average: 0.28, 0.42, 0.43\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:10:54 compute-0 nova_compute[192903]: 2025-10-06 14:10:54.970 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:10:55 compute-0 nova_compute[192903]: 2025-10-06 14:10:55.476 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:10:55 compute-0 nova_compute[192903]: 2025-10-06 14:10:55.986 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:10:55 compute-0 nova_compute[192903]: 2025-10-06 14:10:55.987 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:10:56 compute-0 podman[220404]: 2025-10-06 14:10:56.220250891 +0000 UTC m=+0.071657254 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:10:56 compute-0 podman[220405]: 2025-10-06 14:10:56.220852758 +0000 UTC m=+0.070376351 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:10:56 compute-0 podman[220403]: 2025-10-06 14:10:56.256477249 +0000 UTC m=+0.117074478 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 06 14:10:56 compute-0 nova_compute[192903]: 2025-10-06 14:10:56.481 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:56 compute-0 nova_compute[192903]: 2025-10-06 14:10:56.481 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:10:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:58.136 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:10:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:10:58.137 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:10:58 compute-0 nova_compute[192903]: 2025-10-06 14:10:58.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:58 compute-0 nova_compute[192903]: 2025-10-06 14:10:58.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:58 compute-0 nova_compute[192903]: 2025-10-06 14:10:58.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:58 compute-0 nova_compute[192903]: 2025-10-06 14:10:58.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:58 compute-0 nova_compute[192903]: 2025-10-06 14:10:58.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:10:59 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:10:59 compute-0 nova_compute[192903]: 2025-10-06 14:10:59.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:10:59 compute-0 podman[203308]: time="2025-10-06T14:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:10:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:10:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: ERROR   14:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: ERROR   14:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: ERROR   14:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: ERROR   14:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: ERROR   14:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:11:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:11:03 compute-0 nova_compute[192903]: 2025-10-06 14:11:03.913 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:03 compute-0 nova_compute[192903]: 2025-10-06 14:11:03.913 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:03 compute-0 nova_compute[192903]: 2025-10-06 14:11:03.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:03 compute-0 nova_compute[192903]: 2025-10-06 14:11:03.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:04 compute-0 nova_compute[192903]: 2025-10-06 14:11:04.422 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:11:04 compute-0 nova_compute[192903]: 2025-10-06 14:11:04.986 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:04 compute-0 nova_compute[192903]: 2025-10-06 14:11:04.987 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:04 compute-0 nova_compute[192903]: 2025-10-06 14:11:04.995 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:11:04 compute-0 nova_compute[192903]: 2025-10-06 14:11:04.995 2 INFO nova.compute.claims [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:11:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:05.139 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:06 compute-0 nova_compute[192903]: 2025-10-06 14:11:06.062 2 DEBUG nova.compute.provider_tree [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:11:06 compute-0 podman[220468]: 2025-10-06 14:11:06.222485321 +0000 UTC m=+0.083963853 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:11:06 compute-0 nova_compute[192903]: 2025-10-06 14:11:06.570 2 DEBUG nova.scheduler.client.report [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.082 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.082 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.591 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.592 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.592 2 WARNING neutronclient.v2_0.client [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:07 compute-0 nova_compute[192903]: 2025-10-06 14:11:07.592 2 WARNING neutronclient.v2_0.client [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:08 compute-0 nova_compute[192903]: 2025-10-06 14:11:08.104 2 INFO nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:11:08 compute-0 nova_compute[192903]: 2025-10-06 14:11:08.550 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Successfully created port: 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:11:08 compute-0 nova_compute[192903]: 2025-10-06 14:11:08.705 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:11:08 compute-0 nova_compute[192903]: 2025-10-06 14:11:08.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:08 compute-0 nova_compute[192903]: 2025-10-06 14:11:08.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:09 compute-0 podman[220489]: 2025-10-06 14:11:09.226319219 +0000 UTC m=+0.075584730 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container)
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.256 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Successfully updated port: 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.321 2 DEBUG nova.compute.manager [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-changed-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.321 2 DEBUG nova.compute.manager [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Refreshing instance network info cache due to event network-changed-4150e4a6-f7b6-4478-bdfa-f6179da74cf7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.322 2 DEBUG oslo_concurrency.lockutils [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.322 2 DEBUG oslo_concurrency.lockutils [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.322 2 DEBUG nova.network.neutron [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Refreshing network info cache for port 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.734 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.736 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.737 2 INFO nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Creating image(s)
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.737 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.738 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.739 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.739 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.745 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.747 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.764 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.829 2 WARNING neutronclient.v2_0.client [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.837 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.837 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.838 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.838 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.841 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.842 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.926 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.928 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.975 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.976 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:09 compute-0 nova_compute[192903]: 2025-10-06 14:11:09.977 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.038 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.039 2 DEBUG nova.virt.disk.api [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Checking if we can resize image /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.040 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.126 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.127 2 DEBUG nova.virt.disk.api [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Cannot resize image /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.128 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.128 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Ensure instance console log exists: /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.128 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.129 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.129 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:10 compute-0 nova_compute[192903]: 2025-10-06 14:11:10.931 2 DEBUG nova.network.neutron [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:11:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:11.368 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:11.369 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:11.369 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:12 compute-0 nova_compute[192903]: 2025-10-06 14:11:12.028 2 DEBUG nova.network.neutron [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:11:12 compute-0 nova_compute[192903]: 2025-10-06 14:11:12.536 2 DEBUG oslo_concurrency.lockutils [req-061f8a6a-32f3-473f-bcc6-db3e1a836d62 req-a98f8335-eea6-4a03-ab54-c72c57f79a20 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:11:12 compute-0 nova_compute[192903]: 2025-10-06 14:11:12.537 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquired lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:11:12 compute-0 nova_compute[192903]: 2025-10-06 14:11:12.537 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:11:13 compute-0 nova_compute[192903]: 2025-10-06 14:11:13.917 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:11:13 compute-0 nova_compute[192903]: 2025-10-06 14:11:13.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:13 compute-0 nova_compute[192903]: 2025-10-06 14:11:13.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:14 compute-0 nova_compute[192903]: 2025-10-06 14:11:14.211 2 WARNING neutronclient.v2_0.client [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:14 compute-0 nova_compute[192903]: 2025-10-06 14:11:14.493 2 DEBUG nova.network.neutron [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Updating instance_info_cache with network_info: [{"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.002 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Releasing lock "refresh_cache-35d927dc-a7c1-4457-ae6d-ba716c35b931" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.002 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance network_info: |[{"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.004 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Start _get_guest_xml network_info=[{"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.008 2 WARNING nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.009 2 DEBUG nova.virt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1981519713', uuid='35d927dc-a7c1-4457-ae6d-ba716c35b931'), owner=OwnerMeta(userid='f242e9aec50346eaa7b3bddbda127d84', username='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin', projectid='58ece9e5771a44c2918fd8f7783186f0', projectname='tempest-TestExecuteHostMaintenanceStrategy-251874218'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759875.0092852) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.012 2 DEBUG nova.virt.libvirt.host [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.013 2 DEBUG nova.virt.libvirt.host [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.015 2 DEBUG nova.virt.libvirt.host [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.016 2 DEBUG nova.virt.libvirt.host [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.016 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.016 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.017 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.017 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.017 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.017 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.018 2 DEBUG nova.virt.hardware [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.022 2 DEBUG nova.virt.libvirt.vif [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1981519713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1981519713',id=13,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-yg2nc0kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExe
cuteHostMaintenanceStrategy-251874218-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:11:08Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=35d927dc-a7c1-4457-ae6d-ba716c35b931,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.022 2 DEBUG nova.network.os_vif_util [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.023 2 DEBUG nova.network.os_vif_util [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.023 2 DEBUG nova.objects.instance [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 35d927dc-a7c1-4457-ae6d-ba716c35b931 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.532 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <uuid>35d927dc-a7c1-4457-ae6d-ba716c35b931</uuid>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <name>instance-0000000d</name>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1981519713</nova:name>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:11:15</nova:creationTime>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:11:15 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:11:15 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:user uuid="f242e9aec50346eaa7b3bddbda127d84">tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin</nova:user>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:project uuid="58ece9e5771a44c2918fd8f7783186f0">tempest-TestExecuteHostMaintenanceStrategy-251874218</nova:project>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         <nova:port uuid="4150e4a6-f7b6-4478-bdfa-f6179da74cf7">
Oct 06 14:11:15 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <system>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="serial">35d927dc-a7c1-4457-ae6d-ba716c35b931</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="uuid">35d927dc-a7c1-4457-ae6d-ba716c35b931</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </system>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <os>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </os>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <features>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </features>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.config"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:60:a2:ea"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <target dev="tap4150e4a6-f7"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/console.log" append="off"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <video>
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </video>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:11:15 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:11:15 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:11:15 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:11:15 compute-0 nova_compute[192903]: </domain>
Oct 06 14:11:15 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.533 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Preparing to wait for external event network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.533 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.534 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.534 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.534 2 DEBUG nova.virt.libvirt.vif [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1981519713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1981519713',id=13,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-yg2nc0kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempe
st-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:11:08Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=35d927dc-a7c1-4457-ae6d-ba716c35b931,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.535 2 DEBUG nova.network.os_vif_util [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.535 2 DEBUG nova.network.os_vif_util [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.535 2 DEBUG os_vif [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f64fcd75-d51f-53f8-91c2-3095cb527aca', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4150e4a6-f7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4150e4a6-f7, col_values=(('qos', UUID('aa0d2560-0507-4220-9d4b-a211766b5c37')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4150e4a6-f7, col_values=(('external_ids', {'iface-id': '4150e4a6-f7b6-4478-bdfa-f6179da74cf7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:a2:ea', 'vm-uuid': '35d927dc-a7c1-4457-ae6d-ba716c35b931'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 NetworkManager[52035]: <info>  [1759759875.5445] manager: (tap4150e4a6-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:15 compute-0 nova_compute[192903]: 2025-10-06 14:11:15.549 2 INFO os_vif [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7')
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.088 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.089 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.089 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No VIF found with MAC fa:16:3e:60:a2:ea, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.090 2 INFO nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Using config drive
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.602 2 WARNING neutronclient.v2_0.client [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.806 2 INFO nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Creating config drive at /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.config
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.817 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp4k95sm27 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:17 compute-0 nova_compute[192903]: 2025-10-06 14:11:17.962 2 DEBUG oslo_concurrency.processutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp4k95sm27" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:18 compute-0 kernel: tap4150e4a6-f7: entered promiscuous mode
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.0387] manager: (tap4150e4a6-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Oct 06 14:11:18 compute-0 ovn_controller[95205]: 2025-10-06T14:11:18Z|00107|binding|INFO|Claiming lport 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 for this chassis.
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 ovn_controller[95205]: 2025-10-06T14:11:18Z|00108|binding|INFO|4150e4a6-f7b6-4478-bdfa-f6179da74cf7: Claiming fa:16:3e:60:a2:ea 10.100.0.12
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.065 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a2:ea 10.100.0.12'], port_security=['fa:16:3e:60:a2:ea 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '35d927dc-a7c1-4457-ae6d-ba716c35b931', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=4150e4a6-f7b6-4478-bdfa-f6179da74cf7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.068 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 bound to our chassis
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.071 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:11:18 compute-0 systemd-machined[152985]: New machine qemu-9-instance-0000000d.
Oct 06 14:11:18 compute-0 systemd-udevd[220547]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.097 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b2dee5e1-e7cc-421f-aad3-002596d0b65c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.099 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap37630f0a-81 in ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.102 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap37630f0a-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.102 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc9bca4-2beb-4d83-94d8-3ce8a56ef13b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.104 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b9698753-4d75-4744-ab05-d17d43beb1b4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.1102] device (tap4150e4a6-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.1109] device (tap4150e4a6-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:11:18 compute-0 ovn_controller[95205]: 2025-10-06T14:11:18Z|00109|binding|INFO|Setting lport 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 ovn-installed in OVS
Oct 06 14:11:18 compute-0 ovn_controller[95205]: 2025-10-06T14:11:18Z|00110|binding|INFO|Setting lport 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 up in Southbound
Oct 06 14:11:18 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.117 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[e30e1341-e12d-4ea1-9d7b-99716421dffa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.127 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e008a262-8ffc-4cc6-a357-a382c3ab1a6a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.164 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[2b54fc5e-2dbb-4016-9362-3e56be6fcdff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.170 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a7073046-6de1-4d77-8062-87271f11c737]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 systemd-udevd[220550]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.1718] manager: (tap37630f0a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.221 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecf57a4-3fcf-4ffa-8a16-d043576605fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.224 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[ca237dc4-6b53-4f7a-b11f-4f8686f7619d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.2518] device (tap37630f0a-80): carrier: link connected
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.265 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c68ae260-31ce-4de3-acb0-e1cf6aa6d1c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.287 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9b65666d-1b08-410e-ad11-5a268627e907]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433989, 'reachable_time': 23205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220579, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.296 2 DEBUG nova.compute.manager [req-c6da586f-b35d-43ed-b8d9-6b8e43cde43b req-870fef99-b206-4565-98e4-5413e4b8cd0f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.296 2 DEBUG oslo_concurrency.lockutils [req-c6da586f-b35d-43ed-b8d9-6b8e43cde43b req-870fef99-b206-4565-98e4-5413e4b8cd0f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.297 2 DEBUG oslo_concurrency.lockutils [req-c6da586f-b35d-43ed-b8d9-6b8e43cde43b req-870fef99-b206-4565-98e4-5413e4b8cd0f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.297 2 DEBUG oslo_concurrency.lockutils [req-c6da586f-b35d-43ed-b8d9-6b8e43cde43b req-870fef99-b206-4565-98e4-5413e4b8cd0f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.298 2 DEBUG nova.compute.manager [req-c6da586f-b35d-43ed-b8d9-6b8e43cde43b req-870fef99-b206-4565-98e4-5413e4b8cd0f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Processing event network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.310 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[42a4687f-d1a1-4b6b-9cdb-b3ca957eae2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:7095'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433989, 'tstamp': 433989}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220580, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.331 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8d15d6-dd64-4e8e-a17c-47c5eaf157be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433989, 'reachable_time': 23205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220581, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.377 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3cce87-b6ac-49ee-aefb-d0385d7fdb11]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.452 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[069af8ef-0ce8-4dc3-948f-8059814452b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.454 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.454 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.454 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 kernel: tap37630f0a-80: entered promiscuous mode
Oct 06 14:11:18 compute-0 NetworkManager[52035]: <info>  [1759759878.4566] manager: (tap37630f0a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.460 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 ovn_controller[95205]: 2025-10-06T14:11:18Z|00111|binding|INFO|Releasing lport 01e7ff9b-7072-42b9-b412-c40a88736ea9 from this chassis (sb_readonly=0)
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.464 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9571f511-7ec4-42b9-abe9-b141524c609c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.465 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.465 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.466 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 37630f0a-8aad-4e9a-8c81-a92f8d673f93 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.466 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.466 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b1c555-07ca-4640-a919-010abd8e773d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.467 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.468 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8405aadc-7553-4250-8f9f-385c932b2c2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.468 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:11:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:18.469 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'env', 'PROCESS_TAG=haproxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/37630f0a-8aad-4e9a-8c81-a92f8d673f93.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 podman[220613]: 2025-10-06 14:11:18.918122299 +0000 UTC m=+0.068108880 container create a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 06 14:11:18 compute-0 nova_compute[192903]: 2025-10-06 14:11:18.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:18 compute-0 systemd[1]: Started libpod-conmon-a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298.scope.
Oct 06 14:11:18 compute-0 podman[220613]: 2025-10-06 14:11:18.878160452 +0000 UTC m=+0.028147133 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:11:18 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:11:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca27a79e60c842f1ecc54b46c3746d0509c8f0edd1effc174f96abbc0b8a0d9c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:11:19 compute-0 podman[220613]: 2025-10-06 14:11:19.016085995 +0000 UTC m=+0.166072636 container init a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:11:19 compute-0 podman[220613]: 2025-10-06 14:11:19.0226193 +0000 UTC m=+0.172605901 container start a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:11:19 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [NOTICE]   (220639) : New worker (220641) forked
Oct 06 14:11:19 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [NOTICE]   (220639) : Loading success.
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.451 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.454 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.457 2 INFO nova.virt.libvirt.driver [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance spawned successfully.
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.458 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.975 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.975 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.976 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.976 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.977 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:19 compute-0 nova_compute[192903]: 2025-10-06 14:11:19.977 2 DEBUG nova.virt.libvirt.driver [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.360 2 DEBUG nova.compute.manager [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.360 2 DEBUG oslo_concurrency.lockutils [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.361 2 DEBUG oslo_concurrency.lockutils [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.361 2 DEBUG oslo_concurrency.lockutils [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.361 2 DEBUG nova.compute.manager [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] No waiting events found dispatching network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.362 2 WARNING nova.compute.manager [req-4371c266-3719-4dcb-8d01-fbb04c7e0a1d req-060ea157-3778-4bee-9fc1-4c92110f7622 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received unexpected event network-vif-plugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 for instance with vm_state building and task_state spawning.
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.486 2 INFO nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Took 10.75 seconds to spawn the instance on the hypervisor.
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.487 2 DEBUG nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:11:20 compute-0 nova_compute[192903]: 2025-10-06 14:11:20.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:21 compute-0 nova_compute[192903]: 2025-10-06 14:11:21.027 2 INFO nova.compute.manager [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Took 16.09 seconds to build instance.
Oct 06 14:11:21 compute-0 nova_compute[192903]: 2025-10-06 14:11:21.533 2 DEBUG oslo_concurrency.lockutils [None req-90b0b177-c704-4b20-b383-4aecf3201174 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.619s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:23 compute-0 nova_compute[192903]: 2025-10-06 14:11:23.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:25 compute-0 podman[220650]: 2025-10-06 14:11:25.239487631 +0000 UTC m=+0.086697316 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:11:25 compute-0 nova_compute[192903]: 2025-10-06 14:11:25.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:27 compute-0 podman[220677]: 2025-10-06 14:11:27.226310309 +0000 UTC m=+0.075463296 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 06 14:11:27 compute-0 podman[220676]: 2025-10-06 14:11:27.246679363 +0000 UTC m=+0.092996084 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:11:27 compute-0 podman[220675]: 2025-10-06 14:11:27.289662311 +0000 UTC m=+0.143107173 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 06 14:11:28 compute-0 nova_compute[192903]: 2025-10-06 14:11:28.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:29 compute-0 podman[203308]: time="2025-10-06T14:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:11:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:11:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Oct 06 14:11:30 compute-0 nova_compute[192903]: 2025-10-06 14:11:30.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: ERROR   14:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: ERROR   14:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: ERROR   14:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: ERROR   14:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: ERROR   14:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:11:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:11:33 compute-0 ovn_controller[95205]: 2025-10-06T14:11:33Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:a2:ea 10.100.0.12
Oct 06 14:11:33 compute-0 ovn_controller[95205]: 2025-10-06T14:11:33Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:a2:ea 10.100.0.12
Oct 06 14:11:33 compute-0 nova_compute[192903]: 2025-10-06 14:11:33.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:34 compute-0 nova_compute[192903]: 2025-10-06 14:11:34.079 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Creating tmpfile /var/lib/nova/instances/tmprxq380es to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:11:34 compute-0 nova_compute[192903]: 2025-10-06 14:11:34.080 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:34 compute-0 nova_compute[192903]: 2025-10-06 14:11:34.094 2 DEBUG nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprxq380es',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:11:35 compute-0 nova_compute[192903]: 2025-10-06 14:11:35.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:36 compute-0 nova_compute[192903]: 2025-10-06 14:11:36.134 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:37 compute-0 podman[220749]: 2025-10-06 14:11:37.225013916 +0000 UTC m=+0.088337510 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:11:38 compute-0 nova_compute[192903]: 2025-10-06 14:11:38.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:40 compute-0 podman[220770]: 2025-10-06 14:11:40.227164829 +0000 UTC m=+0.085024971 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal)
Oct 06 14:11:40 compute-0 nova_compute[192903]: 2025-10-06 14:11:40.297 2 DEBUG nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprxq380es',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='971068bc-edb7-4fe2-8822-6603739b1a9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:11:40 compute-0 nova_compute[192903]: 2025-10-06 14:11:40.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:41 compute-0 nova_compute[192903]: 2025-10-06 14:11:41.314 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:11:41 compute-0 nova_compute[192903]: 2025-10-06 14:11:41.315 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:11:41 compute-0 nova_compute[192903]: 2025-10-06 14:11:41.315 2 DEBUG nova.network.neutron [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:11:41 compute-0 nova_compute[192903]: 2025-10-06 14:11:41.825 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:43 compute-0 nova_compute[192903]: 2025-10-06 14:11:43.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:44 compute-0 nova_compute[192903]: 2025-10-06 14:11:44.032 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:44 compute-0 nova_compute[192903]: 2025-10-06 14:11:44.941 2 DEBUG nova.network.neutron [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Updating instance_info_cache with network_info: [{"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.449 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.465 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprxq380es',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='971068bc-edb7-4fe2-8822-6603739b1a9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.466 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Creating instance directory: /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.467 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Creating disk.info with the contents: {'/var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk': 'qcow2', '/var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.467 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.468 2 DEBUG nova.objects.instance [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 971068bc-edb7-4fe2-8822-6603739b1a9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.975 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.983 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:45 compute-0 nova_compute[192903]: 2025-10-06 14:11:45.985 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.079 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.081 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.082 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.083 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.090 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.090 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.166 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.168 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.224 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.225 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.226 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.293 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.294 2 DEBUG nova.virt.disk.api [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.295 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.351 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.352 2 DEBUG nova.virt.disk.api [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.353 2 DEBUG nova.objects.instance [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 971068bc-edb7-4fe2-8822-6603739b1a9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.864 2 DEBUG nova.objects.base [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<971068bc-edb7-4fe2-8822-6603739b1a9c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.865 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.894 2 DEBUG oslo_concurrency.processutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.895 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.897 2 DEBUG nova.virt.libvirt.vif [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:10:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-669739048',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-669739048',id=12,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:10:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-qla3cmqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:10:59Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=971068bc-edb7-4fe2-8822-6603739b1a9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.898 2 DEBUG nova.network.os_vif_util [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.899 2 DEBUG nova.network.os_vif_util [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.899 2 DEBUG os_vif [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd6d21324-07b6-5166-8629-d18e2a0fa05a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bfbc46b-9b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4bfbc46b-9b, col_values=(('qos', UUID('22f12747-1cf9-4421-9698-dbca63d5be19')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4bfbc46b-9b, col_values=(('external_ids', {'iface-id': '4bfbc46b-9b1f-4bfa-83da-c52787c02064', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:2d:58', 'vm-uuid': '971068bc-edb7-4fe2-8822-6603739b1a9c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 NetworkManager[52035]: <info>  [1759759906.9131] manager: (tap4bfbc46b-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.919 2 INFO os_vif [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b')
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.920 2 DEBUG nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.920 2 DEBUG nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprxq380es',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='971068bc-edb7-4fe2-8822-6603739b1a9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.921 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:46 compute-0 nova_compute[192903]: 2025-10-06 14:11:46.992 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:47.595 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:11:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:47.596 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:11:47 compute-0 nova_compute[192903]: 2025-10-06 14:11:47.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:47 compute-0 nova_compute[192903]: 2025-10-06 14:11:47.984 2 DEBUG nova.network.neutron [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Port 4bfbc46b-9b1f-4bfa-83da-c52787c02064 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:11:48 compute-0 nova_compute[192903]: 2025-10-06 14:11:48.005 2 DEBUG nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprxq380es',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='971068bc-edb7-4fe2-8822-6603739b1a9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:11:48 compute-0 ovn_controller[95205]: 2025-10-06T14:11:48Z|00112|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 06 14:11:48 compute-0 nova_compute[192903]: 2025-10-06 14:11:48.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:49 compute-0 nova_compute[192903]: 2025-10-06 14:11:49.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:11:51 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:11:51 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:11:51 compute-0 kernel: tap4bfbc46b-9b: entered promiscuous mode
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:51 compute-0 NetworkManager[52035]: <info>  [1759759911.4861] manager: (tap4bfbc46b-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Oct 06 14:11:51 compute-0 ovn_controller[95205]: 2025-10-06T14:11:51Z|00113|binding|INFO|Claiming lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 for this additional chassis.
Oct 06 14:11:51 compute-0 ovn_controller[95205]: 2025-10-06T14:11:51Z|00114|binding|INFO|4bfbc46b-9b1f-4bfa-83da-c52787c02064: Claiming fa:16:3e:ae:2d:58 10.100.0.5
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.492 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:2d:58 10.100.0.5'], port_security=['fa:16:3e:ae:2d:58 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '971068bc-edb7-4fe2-8822-6603739b1a9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=4bfbc46b-9b1f-4bfa-83da-c52787c02064) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.494 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4bfbc46b-9b1f-4bfa-83da-c52787c02064 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.496 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:11:51 compute-0 ovn_controller[95205]: 2025-10-06T14:11:51Z|00115|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 ovn-installed in OVS
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.524 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[091093f2-175b-40b5-9764-b1aede47cde0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 systemd-machined[152985]: New machine qemu-10-instance-0000000c.
Oct 06 14:11:51 compute-0 systemd-udevd[220845]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:11:51 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Oct 06 14:11:51 compute-0 NetworkManager[52035]: <info>  [1759759911.5487] device (tap4bfbc46b-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:11:51 compute-0 NetworkManager[52035]: <info>  [1759759911.5496] device (tap4bfbc46b-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.564 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7f249eaf-7fe6-41c3-8cae-e7579eece523]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.567 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[225325bf-0f2f-4f11-b607-01b41ef84fc9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.597 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[54be799b-296e-4b39-a513-35184ea7fb93]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.616 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ab648884-72ff-4e98-acd3-470584ebdd91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433989, 'reachable_time': 23205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220857, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.634 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8761aac9-7f38-4f69-96ef-6680d148f9c8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434004, 'tstamp': 434004}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220859, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434008, 'tstamp': 434008}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220859, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.636 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.639 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.640 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.640 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.641 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:11:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:51.642 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e61dee06-e777-4652-bcda-5c3f3844db14]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:11:51 compute-0 nova_compute[192903]: 2025-10-06 14:11:51.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:11:52.597 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:11:53 compute-0 nova_compute[192903]: 2025-10-06 14:11:53.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:11:53 compute-0 nova_compute[192903]: 2025-10-06 14:11:53.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:11:53 compute-0 nova_compute[192903]: 2025-10-06 14:11:53.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:54 compute-0 nova_compute[192903]: 2025-10-06 14:11:54.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:54 compute-0 nova_compute[192903]: 2025-10-06 14:11:54.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:54 compute-0 nova_compute[192903]: 2025-10-06 14:11:54.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:54 compute-0 nova_compute[192903]: 2025-10-06 14:11:54.099 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:11:54 compute-0 ovn_controller[95205]: 2025-10-06T14:11:54Z|00116|binding|INFO|Claiming lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 for this chassis.
Oct 06 14:11:54 compute-0 ovn_controller[95205]: 2025-10-06T14:11:54Z|00117|binding|INFO|4bfbc46b-9b1f-4bfa-83da-c52787c02064: Claiming fa:16:3e:ae:2d:58 10.100.0.5
Oct 06 14:11:54 compute-0 ovn_controller[95205]: 2025-10-06T14:11:54Z|00118|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 up in Southbound
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.146 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.206 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.208 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.286 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.295 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.358 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.359 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.393 2 INFO nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Post operation of migration started
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.394 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.432 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.620 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.622 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.653 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.654 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5537MB free_disk=73.24413299560547GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.654 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.655 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.953 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:55 compute-0 nova_compute[192903]: 2025-10-06 14:11:55.954 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.114 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.115 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.115 2 DEBUG nova.network.neutron [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:11:56 compute-0 podman[220896]: 2025-10-06 14:11:56.204028852 +0000 UTC m=+0.063465306 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.673 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.791 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration for instance 971068bc-edb7-4fe2-8822-6603739b1a9c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:11:56 compute-0 nova_compute[192903]: 2025-10-06 14:11:56.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:57 compute-0 nova_compute[192903]: 2025-10-06 14:11:57.252 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:11:57 compute-0 nova_compute[192903]: 2025-10-06 14:11:57.418 2 DEBUG nova.network.neutron [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Updating instance_info_cache with network_info: [{"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:11:57 compute-0 nova_compute[192903]: 2025-10-06 14:11:57.487 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Updating resource usage from migration 93285829-4902-4070-bb19-762e00d8349c
Oct 06 14:11:57 compute-0 nova_compute[192903]: 2025-10-06 14:11:57.487 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Starting to track incoming migration 93285829-4902-4070-bb19-762e00d8349c with flavor 8cb06c85-e9e7-417f-906b-1f7cf29f7de9 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 06 14:11:57 compute-0 nova_compute[192903]: 2025-10-06 14:11:57.924 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-971068bc-edb7-4fe2-8822-6603739b1a9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.016 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 35d927dc-a7c1-4457-ae6d-ba716c35b931 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:11:58 compute-0 podman[220923]: 2025-10-06 14:11:58.220815141 +0000 UTC m=+0.066033855 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:11:58 compute-0 podman[220921]: 2025-10-06 14:11:58.258417395 +0000 UTC m=+0.111036316 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Oct 06 14:11:58 compute-0 podman[220922]: 2025-10-06 14:11:58.260144561 +0000 UTC m=+0.106707251 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.449 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.523 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 971068bc-edb7-4fe2-8822-6603739b1a9c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.524 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.524 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:11:55 up  1:12,  0 user,  load average: 0.39, 0.44, 0.44\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_58ece9e5771a44c2918fd8f7783186f0': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.540 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.562 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.562 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.594 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.612 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.660 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:11:58 compute-0 nova_compute[192903]: 2025-10-06 14:11:58.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.170 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.681 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.682 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.027s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.682 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 1.234s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.683 2 DEBUG oslo_concurrency.lockutils [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:11:59 compute-0 nova_compute[192903]: 2025-10-06 14:11:59.688 2 INFO nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:11:59 compute-0 virtqemud[192802]: Domain id=10 name='instance-0000000c' uuid=971068bc-edb7-4fe2-8822-6603739b1a9c is tainted: custom-monitor
Oct 06 14:11:59 compute-0 podman[203308]: time="2025-10-06T14:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:11:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:11:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.684 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.685 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.686 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.686 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.687 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.687 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:12:00 compute-0 nova_compute[192903]: 2025-10-06 14:12:00.696 2 INFO nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: ERROR   14:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: ERROR   14:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: ERROR   14:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: ERROR   14:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: ERROR   14:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:12:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:12:01 compute-0 nova_compute[192903]: 2025-10-06 14:12:01.704 2 INFO nova.virt.libvirt.driver [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:12:01 compute-0 nova_compute[192903]: 2025-10-06 14:12:01.711 2 DEBUG nova.compute.manager [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:12:01 compute-0 nova_compute[192903]: 2025-10-06 14:12:01.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:02 compute-0 nova_compute[192903]: 2025-10-06 14:12:02.221 2 DEBUG nova.objects.instance [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:12:03 compute-0 nova_compute[192903]: 2025-10-06 14:12:03.243 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:03 compute-0 nova_compute[192903]: 2025-10-06 14:12:03.952 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:03 compute-0 nova_compute[192903]: 2025-10-06 14:12:03.953 2 WARNING neutronclient.v2_0.client [None req-b606224e-3be2-43ec-bb71-a683105beb1a f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:03 compute-0 nova_compute[192903]: 2025-10-06 14:12:03.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:06 compute-0 nova_compute[192903]: 2025-10-06 14:12:06.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:08 compute-0 podman[220981]: 2025-10-06 14:12:08.223674408 +0000 UTC m=+0.078280102 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.320 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.321 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.321 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.322 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.322 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.336 2 INFO nova.compute.manager [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Terminating instance
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.854 2 DEBUG nova.compute.manager [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:12:08 compute-0 kernel: tap4150e4a6-f7 (unregistering): left promiscuous mode
Oct 06 14:12:08 compute-0 NetworkManager[52035]: <info>  [1759759928.8856] device (tap4150e4a6-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:12:08 compute-0 ovn_controller[95205]: 2025-10-06T14:12:08Z|00119|binding|INFO|Releasing lport 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 from this chassis (sb_readonly=0)
Oct 06 14:12:08 compute-0 ovn_controller[95205]: 2025-10-06T14:12:08Z|00120|binding|INFO|Setting lport 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 down in Southbound
Oct 06 14:12:08 compute-0 ovn_controller[95205]: 2025-10-06T14:12:08Z|00121|binding|INFO|Removing iface tap4150e4a6-f7 ovn-installed in OVS
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.908 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a2:ea 10.100.0.12'], port_security=['fa:16:3e:60:a2:ea 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '35d927dc-a7c1-4457-ae6d-ba716c35b931', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=4150e4a6-f7b6-4478-bdfa-f6179da74cf7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.908 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4150e4a6-f7b6-4478-bdfa-f6179da74cf7 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.909 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.941 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9114f935-c02b-416a-8ac7-f4fb53226ff1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:08 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct 06 14:12:08 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 15.121s CPU time.
Oct 06 14:12:08 compute-0 systemd-machined[152985]: Machine qemu-9-instance-0000000d terminated.
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.981 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[e268df0a-b486-4cf3-85ca-8c3904232f37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:08.984 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc7c81b-6546-4df2-b2b3-8a1449a6cf3f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:08 compute-0 nova_compute[192903]: 2025-10-06 14:12:08.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.022 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[eed59411-d924-49b3-b5d8-11d191d844d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.045 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[398539c2-6456-4dff-95e0-a9fb4971a748]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433989, 'reachable_time': 23205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221013, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.068 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e9369315-5905-4ccb-b374-071736e8acfa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434004, 'tstamp': 434004}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221014, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434008, 'tstamp': 434008}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221014, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.070 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.078 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.078 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.079 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.079 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:12:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:09.081 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a13ac1e9-6563-4d69-80b7-da5222eca55d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.121 2 INFO nova.virt.libvirt.driver [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Instance destroyed successfully.
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.121 2 DEBUG nova.objects.instance [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'resources' on Instance uuid 35d927dc-a7c1-4457-ae6d-ba716c35b931 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.400 2 DEBUG nova.compute.manager [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.401 2 DEBUG oslo_concurrency.lockutils [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.401 2 DEBUG oslo_concurrency.lockutils [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.402 2 DEBUG oslo_concurrency.lockutils [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.402 2 DEBUG nova.compute.manager [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] No waiting events found dispatching network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.402 2 DEBUG nova.compute.manager [req-3df53d04-37b2-494b-849f-09798f298d63 req-cbd03497-83ba-4d85-a150-ec5dd3f32a95 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.629 2 DEBUG nova.virt.libvirt.vif [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1981519713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1981519713',id=13,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:11:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-yg2nc0kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:11:20Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=35d927dc-a7c1-4457-ae6d-ba716c35b931,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.630 2 DEBUG nova.network.os_vif_util [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "address": "fa:16:3e:60:a2:ea", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4150e4a6-f7", "ovs_interfaceid": "4150e4a6-f7b6-4478-bdfa-f6179da74cf7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.631 2 DEBUG nova.network.os_vif_util [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.632 2 DEBUG os_vif [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4150e4a6-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=aa0d2560-0507-4220-9d4b-a211766b5c37) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.646 2 INFO os_vif [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a2:ea,bridge_name='br-int',has_traffic_filtering=True,id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4150e4a6-f7')
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.647 2 INFO nova.virt.libvirt.driver [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Deleting instance files /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931_del
Oct 06 14:12:09 compute-0 nova_compute[192903]: 2025-10-06 14:12:09.648 2 INFO nova.virt.libvirt.driver [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Deletion of /var/lib/nova/instances/35d927dc-a7c1-4457-ae6d-ba716c35b931_del complete
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.161 2 INFO nova.compute.manager [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.162 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.162 2 DEBUG nova.compute.manager [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.162 2 DEBUG nova.network.neutron [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.163 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:10 compute-0 nova_compute[192903]: 2025-10-06 14:12:10.948 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:11 compute-0 podman[221032]: 2025-10-06 14:12:11.24338494 +0000 UTC m=+0.095910473 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.260 2 DEBUG nova.compute.manager [req-6a019f86-12a8-407a-9644-e13e6a3689ed req-e404d2b7-b44e-4deb-a552-50180db0f9e5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-deleted-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.261 2 INFO nova.compute.manager [req-6a019f86-12a8-407a-9644-e13e6a3689ed req-e404d2b7-b44e-4deb-a552-50180db0f9e5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Neutron deleted interface 4150e4a6-f7b6-4478-bdfa-f6179da74cf7; detaching it from the instance and deleting it from the info cache
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.261 2 DEBUG nova.network.neutron [req-6a019f86-12a8-407a-9644-e13e6a3689ed req-e404d2b7-b44e-4deb-a552-50180db0f9e5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:12:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:11.370 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:11.370 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:11.370 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.500 2 DEBUG nova.compute.manager [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.501 2 DEBUG oslo_concurrency.lockutils [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.503 2 DEBUG oslo_concurrency.lockutils [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.504 2 DEBUG oslo_concurrency.lockutils [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.504 2 DEBUG nova.compute.manager [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] No waiting events found dispatching network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.505 2 DEBUG nova.compute.manager [req-8c3484b6-3da3-47d4-9494-83b8500032f5 req-32395629-ccda-4482-a5b1-71c40a420035 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Received event network-vif-unplugged-4150e4a6-f7b6-4478-bdfa-f6179da74cf7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.718 2 DEBUG nova.network.neutron [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:12:11 compute-0 nova_compute[192903]: 2025-10-06 14:12:11.770 2 DEBUG nova.compute.manager [req-6a019f86-12a8-407a-9644-e13e6a3689ed req-e404d2b7-b44e-4deb-a552-50180db0f9e5 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Detach interface failed, port_id=4150e4a6-f7b6-4478-bdfa-f6179da74cf7, reason: Instance 35d927dc-a7c1-4457-ae6d-ba716c35b931 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:12:12 compute-0 nova_compute[192903]: 2025-10-06 14:12:12.225 2 INFO nova.compute.manager [-] [instance: 35d927dc-a7c1-4457-ae6d-ba716c35b931] Took 2.06 seconds to deallocate network for instance.
Oct 06 14:12:12 compute-0 nova_compute[192903]: 2025-10-06 14:12:12.750 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:12 compute-0 nova_compute[192903]: 2025-10-06 14:12:12.751 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:12 compute-0 nova_compute[192903]: 2025-10-06 14:12:12.817 2 DEBUG nova.compute.provider_tree [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:12:13 compute-0 nova_compute[192903]: 2025-10-06 14:12:13.326 2 DEBUG nova.scheduler.client.report [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:12:13 compute-0 nova_compute[192903]: 2025-10-06 14:12:13.841 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:13 compute-0 nova_compute[192903]: 2025-10-06 14:12:13.864 2 INFO nova.scheduler.client.report [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Deleted allocations for instance 35d927dc-a7c1-4457-ae6d-ba716c35b931
Oct 06 14:12:13 compute-0 nova_compute[192903]: 2025-10-06 14:12:13.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:14 compute-0 nova_compute[192903]: 2025-10-06 14:12:14.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:14 compute-0 nova_compute[192903]: 2025-10-06 14:12:14.898 2 DEBUG oslo_concurrency.lockutils [None req-613c0a7c-b25d-43bd-a1e3-e98889e2674b f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "35d927dc-a7c1-4457-ae6d-ba716c35b931" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.577s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:15 compute-0 nova_compute[192903]: 2025-10-06 14:12:15.984 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:15 compute-0 nova_compute[192903]: 2025-10-06 14:12:15.985 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:15 compute-0 nova_compute[192903]: 2025-10-06 14:12:15.985 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:15 compute-0 nova_compute[192903]: 2025-10-06 14:12:15.986 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:15 compute-0 nova_compute[192903]: 2025-10-06 14:12:15.986 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.002 2 INFO nova.compute.manager [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Terminating instance
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.519 2 DEBUG nova.compute.manager [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:12:16 compute-0 kernel: tap4bfbc46b-9b (unregistering): left promiscuous mode
Oct 06 14:12:16 compute-0 NetworkManager[52035]: <info>  [1759759936.5512] device (tap4bfbc46b-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00122|binding|INFO|Releasing lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 from this chassis (sb_readonly=0)
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00123|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 down in Southbound
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00124|binding|INFO|Removing iface tap4bfbc46b-9b ovn-installed in OVS
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.568 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:2d:58 10.100.0.5'], port_security=['fa:16:3e:ae:2d:58 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '971068bc-edb7-4fe2-8822-6603739b1a9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=4bfbc46b-9b1f-4bfa-83da-c52787c02064) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.571 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4bfbc46b-9b1f-4bfa-83da-c52787c02064 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.572 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.573 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c07616b6-6113-4e3a-ac9d-c948f5fc2d6c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.574 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 namespace which is not needed anymore
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct 06 14:12:16 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 2.794s CPU time.
Oct 06 14:12:16 compute-0 systemd-machined[152985]: Machine qemu-10-instance-0000000c terminated.
Oct 06 14:12:16 compute-0 kernel: tap4bfbc46b-9b: entered promiscuous mode
Oct 06 14:12:16 compute-0 NetworkManager[52035]: <info>  [1759759936.7435] manager: (tap4bfbc46b-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00125|binding|INFO|Claiming lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 for this chassis.
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00126|binding|INFO|4bfbc46b-9b1f-4bfa-83da-c52787c02064: Claiming fa:16:3e:ae:2d:58 10.100.0.5
Oct 06 14:12:16 compute-0 kernel: tap4bfbc46b-9b (unregistering): left promiscuous mode
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.749 2 DEBUG nova.compute.manager [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.750 2 DEBUG oslo_concurrency.lockutils [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.751 2 DEBUG oslo_concurrency.lockutils [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.751 2 DEBUG oslo_concurrency.lockutils [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.752 2 DEBUG nova.compute.manager [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.752 2 DEBUG nova.compute.manager [req-83fef678-ffbb-4dea-81cc-454d07d76928 req-2166664b-a538-43aa-978a-d380e8aafb61 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.754 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:2d:58 10.100.0.5'], port_security=['fa:16:3e:ae:2d:58 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '971068bc-edb7-4fe2-8822-6603739b1a9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=4bfbc46b-9b1f-4bfa-83da-c52787c02064) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:12:16 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [NOTICE]   (220639) : haproxy version is 3.0.5-8e879a5
Oct 06 14:12:16 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [NOTICE]   (220639) : path to executable is /usr/sbin/haproxy
Oct 06 14:12:16 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [WARNING]  (220639) : Exiting Master process...
Oct 06 14:12:16 compute-0 podman[221086]: 2025-10-06 14:12:16.778451573 +0000 UTC m=+0.073764651 container kill a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00127|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 ovn-installed in OVS
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00128|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 up in Southbound
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00129|binding|INFO|Releasing lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 from this chassis (sb_readonly=1)
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00130|if_status|INFO|Not setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 down as sb is readonly
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00131|binding|INFO|Removing iface tap4bfbc46b-9b ovn-installed in OVS
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00132|binding|INFO|Releasing lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 from this chassis (sb_readonly=0)
Oct 06 14:12:16 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [ALERT]    (220639) : Current worker (220641) exited with code 143 (Terminated)
Oct 06 14:12:16 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[220635]: [WARNING]  (220639) : All workers exited. Exiting... (0)
Oct 06 14:12:16 compute-0 ovn_controller[95205]: 2025-10-06T14:12:16Z|00133|binding|INFO|Setting lport 4bfbc46b-9b1f-4bfa-83da-c52787c02064 down in Southbound
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.792 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:2d:58 10.100.0.5'], port_security=['fa:16:3e:ae:2d:58 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '971068bc-edb7-4fe2-8822-6603739b1a9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=4bfbc46b-9b1f-4bfa-83da-c52787c02064) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:12:16 compute-0 systemd[1]: libpod-a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298.scope: Deactivated successfully.
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.836 2 INFO nova.virt.libvirt.driver [-] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Instance destroyed successfully.
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.836 2 DEBUG nova.objects.instance [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'resources' on Instance uuid 971068bc-edb7-4fe2-8822-6603739b1a9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:12:16 compute-0 podman[221116]: 2025-10-06 14:12:16.861571723 +0000 UTC m=+0.036766323 container died a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298-userdata-shm.mount: Deactivated successfully.
Oct 06 14:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca27a79e60c842f1ecc54b46c3746d0509c8f0edd1effc174f96abbc0b8a0d9c-merged.mount: Deactivated successfully.
Oct 06 14:12:16 compute-0 podman[221116]: 2025-10-06 14:12:16.915501723 +0000 UTC m=+0.090696293 container remove a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.922 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[af2ecb3d-27e7-4809-860d-bf2fd063a816]: (4, ("Mon Oct  6 02:12:16 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 (a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298)\na6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298\nMon Oct  6 02:12:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 (a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298)\na6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.924 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[58cd6534-8208-452b-88d4-bb2fdf61e119]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 systemd[1]: libpod-conmon-a6c04021afa64aa80966432c44c1a2848897ba841bafaff6a41824f8286c5298.scope: Deactivated successfully.
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.925 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.925 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[806bea64-f69b-4943-ad68-61ac8c9018b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.926 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 kernel: tap37630f0a-80: left promiscuous mode
Oct 06 14:12:16 compute-0 nova_compute[192903]: 2025-10-06 14:12:16.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.951 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6f521ba7-53da-409b-b525-72d5b8e03234]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.988 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3242baf7-12d2-43a6-abc2-4b6d0ebdcf20]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:16.989 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2d8fb0-5c96-4edc-939b-2e48e1b4a883]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.009 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[70482a79-af66-4c29-a1ce-7218622dc863]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433979, 'reachable_time': 43111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221148, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.013 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.013 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9f0536-e7d5-4913-a9d7-9b7fc7bdea23]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.014 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4bfbc46b-9b1f-4bfa-83da-c52787c02064 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:12:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d37630f0a\x2d8aad\x2d4e9a\x2d8c81\x2da92f8d673f93.mount: Deactivated successfully.
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.015 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.016 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cac24742-126e-4303-a1f7-5623e0a8b7d4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.017 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 4bfbc46b-9b1f-4bfa-83da-c52787c02064 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.018 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:12:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:17.019 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad1bddc-8bec-4da3-84e4-16da312f407b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.343 2 DEBUG nova.virt.libvirt.vif [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:10:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-669739048',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-669739048',id=12,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:10:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-qla3cmqy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:12:02Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=971068bc-edb7-4fe2-8822-6603739b1a9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.344 2 DEBUG nova.network.os_vif_util [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "address": "fa:16:3e:ae:2d:58", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bfbc46b-9b", "ovs_interfaceid": "4bfbc46b-9b1f-4bfa-83da-c52787c02064", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.345 2 DEBUG nova.network.os_vif_util [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.346 2 DEBUG os_vif [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.349 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bfbc46b-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.354 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=22f12747-1cf9-4421-9698-dbca63d5be19) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.359 2 INFO os_vif [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:2d:58,bridge_name='br-int',has_traffic_filtering=True,id=4bfbc46b-9b1f-4bfa-83da-c52787c02064,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bfbc46b-9b')
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.360 2 INFO nova.virt.libvirt.driver [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Deleting instance files /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c_del
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.361 2 INFO nova.virt.libvirt.driver [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Deletion of /var/lib/nova/instances/971068bc-edb7-4fe2-8822-6603739b1a9c_del complete
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.882 2 INFO nova.compute.manager [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Took 1.36 seconds to destroy the instance on the hypervisor.
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.883 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.883 2 DEBUG nova.compute.manager [-] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.884 2 DEBUG nova.network.neutron [-] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:12:17 compute-0 nova_compute[192903]: 2025-10-06 14:12:17.884 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.812 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.813 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.814 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.814 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.815 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.815 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.816 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.816 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.817 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.817 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.817 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.818 2 WARNING nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received unexpected event network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with vm_state active and task_state deleting.
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.818 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.819 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.819 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.820 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.820 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.821 2 WARNING nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received unexpected event network-vif-plugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with vm_state active and task_state deleting.
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.821 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.822 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.822 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.823 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.823 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.824 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.824 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.825 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.825 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.826 2 DEBUG oslo_concurrency.lockutils [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.826 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] No waiting events found dispatching network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.827 2 DEBUG nova.compute.manager [req-e3fd5f5a-a404-4dad-9695-b2dae99ad989 req-c7bce3eb-3fda-41ec-b7cd-6523f40e62c4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-unplugged-4bfbc46b-9b1f-4bfa-83da-c52787c02064 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.966 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:12:18 compute-0 nova_compute[192903]: 2025-10-06 14:12:18.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:19 compute-0 nova_compute[192903]: 2025-10-06 14:12:19.879 2 DEBUG nova.compute.manager [req-5b1d8c2a-eba3-4346-b8f6-8e439a3b3595 req-590a1b80-123e-45c8-82ff-fb2b4c2162cb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Received event network-vif-deleted-4bfbc46b-9b1f-4bfa-83da-c52787c02064 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:12:19 compute-0 nova_compute[192903]: 2025-10-06 14:12:19.880 2 INFO nova.compute.manager [req-5b1d8c2a-eba3-4346-b8f6-8e439a3b3595 req-590a1b80-123e-45c8-82ff-fb2b4c2162cb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Neutron deleted interface 4bfbc46b-9b1f-4bfa-83da-c52787c02064; detaching it from the instance and deleting it from the info cache
Oct 06 14:12:19 compute-0 nova_compute[192903]: 2025-10-06 14:12:19.880 2 DEBUG nova.network.neutron [req-5b1d8c2a-eba3-4346-b8f6-8e439a3b3595 req-590a1b80-123e-45c8-82ff-fb2b4c2162cb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:12:20 compute-0 nova_compute[192903]: 2025-10-06 14:12:20.339 2 DEBUG nova.network.neutron [-] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:12:20 compute-0 nova_compute[192903]: 2025-10-06 14:12:20.387 2 DEBUG nova.compute.manager [req-5b1d8c2a-eba3-4346-b8f6-8e439a3b3595 req-590a1b80-123e-45c8-82ff-fb2b4c2162cb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Detach interface failed, port_id=4bfbc46b-9b1f-4bfa-83da-c52787c02064, reason: Instance 971068bc-edb7-4fe2-8822-6603739b1a9c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:12:20 compute-0 nova_compute[192903]: 2025-10-06 14:12:20.847 2 INFO nova.compute.manager [-] [instance: 971068bc-edb7-4fe2-8822-6603739b1a9c] Took 2.96 seconds to deallocate network for instance.
Oct 06 14:12:21 compute-0 nova_compute[192903]: 2025-10-06 14:12:21.370 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:21 compute-0 nova_compute[192903]: 2025-10-06 14:12:21.370 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:21 compute-0 nova_compute[192903]: 2025-10-06 14:12:21.375 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:21 compute-0 nova_compute[192903]: 2025-10-06 14:12:21.401 2 INFO nova.scheduler.client.report [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Deleted allocations for instance 971068bc-edb7-4fe2-8822-6603739b1a9c
Oct 06 14:12:22 compute-0 nova_compute[192903]: 2025-10-06 14:12:22.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:22 compute-0 nova_compute[192903]: 2025-10-06 14:12:22.428 2 DEBUG oslo_concurrency.lockutils [None req-43b75fc0-8a85-4f7e-b184-548c3107db45 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "971068bc-edb7-4fe2-8822-6603739b1a9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.442s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:23 compute-0 nova_compute[192903]: 2025-10-06 14:12:23.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:27 compute-0 podman[221149]: 2025-10-06 14:12:27.257755115 +0000 UTC m=+0.107450692 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:12:27 compute-0 nova_compute[192903]: 2025-10-06 14:12:27.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:29 compute-0 nova_compute[192903]: 2025-10-06 14:12:29.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:29 compute-0 podman[221176]: 2025-10-06 14:12:29.242639131 +0000 UTC m=+0.085885795 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Oct 06 14:12:29 compute-0 podman[221175]: 2025-10-06 14:12:29.2527352 +0000 UTC m=+0.101629454 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 06 14:12:29 compute-0 podman[221174]: 2025-10-06 14:12:29.287366885 +0000 UTC m=+0.142875366 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:12:29 compute-0 podman[203308]: time="2025-10-06T14:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:12:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:12:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: ERROR   14:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: ERROR   14:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: ERROR   14:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: ERROR   14:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: ERROR   14:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:12:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:12:32 compute-0 nova_compute[192903]: 2025-10-06 14:12:32.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:34 compute-0 nova_compute[192903]: 2025-10-06 14:12:34.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:37 compute-0 nova_compute[192903]: 2025-10-06 14:12:37.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:39 compute-0 nova_compute[192903]: 2025-10-06 14:12:39.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:39 compute-0 podman[221238]: 2025-10-06 14:12:39.232596873 +0000 UTC m=+0.090010595 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:12:42 compute-0 podman[221258]: 2025-10-06 14:12:42.241254269 +0000 UTC m=+0.096459406 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Oct 06 14:12:42 compute-0 nova_compute[192903]: 2025-10-06 14:12:42.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:44 compute-0 nova_compute[192903]: 2025-10-06 14:12:44.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:47 compute-0 nova_compute[192903]: 2025-10-06 14:12:47.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:49 compute-0 nova_compute[192903]: 2025-10-06 14:12:49.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:50 compute-0 nova_compute[192903]: 2025-10-06 14:12:50.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:52 compute-0 nova_compute[192903]: 2025-10-06 14:12:52.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:54 compute-0 nova_compute[192903]: 2025-10-06 14:12:54.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:54 compute-0 nova_compute[192903]: 2025-10-06 14:12:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.094 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:55 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:55.147 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:12:55 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:12:55.148 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.274 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.276 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.312 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.313 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5873MB free_disk=73.30220413208008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.313 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:12:55 compute-0 nova_compute[192903]: 2025-10-06 14:12:55.313 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:12:56 compute-0 nova_compute[192903]: 2025-10-06 14:12:56.356 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:12:56 compute-0 nova_compute[192903]: 2025-10-06 14:12:56.356 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:12:55 up  1:13,  0 user,  load average: 0.56, 0.48, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:12:56 compute-0 nova_compute[192903]: 2025-10-06 14:12:56.380 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:12:56 compute-0 nova_compute[192903]: 2025-10-06 14:12:56.887 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:12:57 compute-0 nova_compute[192903]: 2025-10-06 14:12:57.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:57 compute-0 nova_compute[192903]: 2025-10-06 14:12:57.401 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:12:57 compute-0 nova_compute[192903]: 2025-10-06 14:12:57.401 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:12:58 compute-0 podman[221282]: 2025-10-06 14:12:58.226907579 +0000 UTC m=+0.085329329 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.397 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.398 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.910 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.911 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.912 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:12:58 compute-0 nova_compute[192903]: 2025-10-06 14:12:58.912 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:12:59 compute-0 nova_compute[192903]: 2025-10-06 14:12:59.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:12:59 compute-0 podman[203308]: time="2025-10-06T14:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:12:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:12:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Oct 06 14:13:00 compute-0 nova_compute[192903]: 2025-10-06 14:13:00.097 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:00 compute-0 nova_compute[192903]: 2025-10-06 14:13:00.098 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:00 compute-0 podman[221309]: 2025-10-06 14:13:00.238954059 +0000 UTC m=+0.089744008 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 14:13:00 compute-0 podman[221308]: 2025-10-06 14:13:00.238849086 +0000 UTC m=+0.095575953 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:13:00 compute-0 podman[221307]: 2025-10-06 14:13:00.2951525 +0000 UTC m=+0.157793525 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true)
Oct 06 14:13:00 compute-0 nova_compute[192903]: 2025-10-06 14:13:00.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:00 compute-0 nova_compute[192903]: 2025-10-06 14:13:00.605 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:13:01 compute-0 nova_compute[192903]: 2025-10-06 14:13:01.147 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:01 compute-0 nova_compute[192903]: 2025-10-06 14:13:01.147 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:01 compute-0 nova_compute[192903]: 2025-10-06 14:13:01.156 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:13:01 compute-0 nova_compute[192903]: 2025-10-06 14:13:01.156 2 INFO nova.compute.claims [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: ERROR   14:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: ERROR   14:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: ERROR   14:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: ERROR   14:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: ERROR   14:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:13:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:13:02 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:02.150 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:02 compute-0 nova_compute[192903]: 2025-10-06 14:13:02.228 2 DEBUG nova.compute.provider_tree [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:13:02 compute-0 nova_compute[192903]: 2025-10-06 14:13:02.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:02 compute-0 nova_compute[192903]: 2025-10-06 14:13:02.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:02 compute-0 nova_compute[192903]: 2025-10-06 14:13:02.737 2 DEBUG nova.scheduler.client.report [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.249 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.250 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.762 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.762 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.763 2 WARNING neutronclient.v2_0.client [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:03 compute-0 nova_compute[192903]: 2025-10-06 14:13:03.764 2 WARNING neutronclient.v2_0.client [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:04 compute-0 nova_compute[192903]: 2025-10-06 14:13:04.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:04 compute-0 nova_compute[192903]: 2025-10-06 14:13:04.272 2 INFO nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:13:04 compute-0 nova_compute[192903]: 2025-10-06 14:13:04.781 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:13:04 compute-0 nova_compute[192903]: 2025-10-06 14:13:04.961 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Successfully created port: 49d1791e-0799-488d-b6c7-50cd175f0414 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.804 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.806 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.806 2 INFO nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Creating image(s)
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.807 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.808 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.809 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.810 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.816 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.818 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.904 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.905 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.906 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.908 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.914 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:05 compute-0 nova_compute[192903]: 2025-10-06 14:13:05.915 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.014 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.015 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.065 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.067 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.068 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.156 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.158 2 DEBUG nova.virt.disk.api [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Checking if we can resize image /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.158 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.177 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Successfully updated port: 49d1791e-0799-488d-b6c7-50cd175f0414 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.224 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.225 2 DEBUG nova.virt.disk.api [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Cannot resize image /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.225 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.226 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Ensure instance console log exists: /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.227 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.227 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.228 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.246 2 DEBUG nova.compute.manager [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-changed-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.247 2 DEBUG nova.compute.manager [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Refreshing instance network info cache due to event network-changed-49d1791e-0799-488d-b6c7-50cd175f0414. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.247 2 DEBUG oslo_concurrency.lockutils [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.248 2 DEBUG oslo_concurrency.lockutils [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.248 2 DEBUG nova.network.neutron [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Refreshing network info cache for port 49d1791e-0799-488d-b6c7-50cd175f0414 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.687 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.755 2 WARNING neutronclient.v2_0.client [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:06 compute-0 nova_compute[192903]: 2025-10-06 14:13:06.963 2 DEBUG nova.network.neutron [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:13:07 compute-0 nova_compute[192903]: 2025-10-06 14:13:07.131 2 DEBUG nova.network.neutron [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:13:07 compute-0 nova_compute[192903]: 2025-10-06 14:13:07.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:07 compute-0 nova_compute[192903]: 2025-10-06 14:13:07.639 2 DEBUG oslo_concurrency.lockutils [req-65c28e60-17d1-472a-a2ca-bb00c2e23789 req-da4a7b67-8393-42d6-883a-fcb7292b55e1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:13:07 compute-0 nova_compute[192903]: 2025-10-06 14:13:07.641 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquired lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:13:07 compute-0 nova_compute[192903]: 2025-10-06 14:13:07.641 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:13:08 compute-0 nova_compute[192903]: 2025-10-06 14:13:08.976 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:13:09 compute-0 nova_compute[192903]: 2025-10-06 14:13:09.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:09 compute-0 nova_compute[192903]: 2025-10-06 14:13:09.227 2 WARNING neutronclient.v2_0.client [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.046 2 DEBUG nova.network.neutron [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Updating instance_info_cache with network_info: [{"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:13:10 compute-0 podman[221382]: 2025-10-06 14:13:10.225112044 +0000 UTC m=+0.081407049 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.554 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Releasing lock "refresh_cache-cd58c0d7-99da-40d0-b5df-9a3dbac42360" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.555 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance network_info: |[{"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.558 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Start _get_guest_xml network_info=[{"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.564 2 WARNING nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.566 2 DEBUG nova.virt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-838001624', uuid='cd58c0d7-99da-40d0-b5df-9a3dbac42360'), owner=OwnerMeta(userid='f242e9aec50346eaa7b3bddbda127d84', username='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin', projectid='58ece9e5771a44c2918fd8f7783186f0', projectname='tempest-TestExecuteHostMaintenanceStrategy-251874218'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759759990.5664217) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.571 2 DEBUG nova.virt.libvirt.host [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.572 2 DEBUG nova.virt.libvirt.host [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.575 2 DEBUG nova.virt.libvirt.host [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.576 2 DEBUG nova.virt.libvirt.host [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.576 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.577 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.577 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.578 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.578 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.578 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.578 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.579 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.579 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.580 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.580 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.581 2 DEBUG nova.virt.hardware [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.587 2 DEBUG nova.virt.libvirt.vif [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:12:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-838001624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-838001624',id=15,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-upqtbhcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:13:04Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=cd58c0d7-99da-40d0-b5df-9a3dbac42360,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.587 2 DEBUG nova.network.os_vif_util [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.589 2 DEBUG nova.network.os_vif_util [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:13:10 compute-0 nova_compute[192903]: 2025-10-06 14:13:10.590 2 DEBUG nova.objects.instance [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid cd58c0d7-99da-40d0-b5df-9a3dbac42360 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.101 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <uuid>cd58c0d7-99da-40d0-b5df-9a3dbac42360</uuid>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <name>instance-0000000f</name>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-838001624</nova:name>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:13:10</nova:creationTime>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:13:11 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:13:11 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:user uuid="f242e9aec50346eaa7b3bddbda127d84">tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin</nova:user>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:project uuid="58ece9e5771a44c2918fd8f7783186f0">tempest-TestExecuteHostMaintenanceStrategy-251874218</nova:project>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         <nova:port uuid="49d1791e-0799-488d-b6c7-50cd175f0414">
Oct 06 14:13:11 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <system>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="serial">cd58c0d7-99da-40d0-b5df-9a3dbac42360</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="uuid">cd58c0d7-99da-40d0-b5df-9a3dbac42360</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </system>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <os>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </os>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <features>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </features>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.config"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:35:ed:4b"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <target dev="tap49d1791e-07"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/console.log" append="off"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <video>
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </video>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:13:11 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:13:11 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:13:11 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:13:11 compute-0 nova_compute[192903]: </domain>
Oct 06 14:13:11 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.103 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Preparing to wait for external event network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.103 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.104 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.104 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.105 2 DEBUG nova.virt.libvirt.vif [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:12:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-838001624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-838001624',id=15,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-upqtbhcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:13:04Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=cd58c0d7-99da-40d0-b5df-9a3dbac42360,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.106 2 DEBUG nova.network.os_vif_util [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.107 2 DEBUG nova.network.os_vif_util [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.107 2 DEBUG os_vif [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.109 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.110 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '646d1827-d40e-594a-9ba5-a025f9749513', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.119 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49d1791e-07, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap49d1791e-07, col_values=(('qos', UUID('1777c7e6-7c88-446a-9afd-0d3fa7d1fc94')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap49d1791e-07, col_values=(('external_ids', {'iface-id': '49d1791e-0799-488d-b6c7-50cd175f0414', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:ed:4b', 'vm-uuid': 'cd58c0d7-99da-40d0-b5df-9a3dbac42360'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 NetworkManager[52035]: <info>  [1759759991.1234] manager: (tap49d1791e-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:11 compute-0 nova_compute[192903]: 2025-10-06 14:13:11.132 2 INFO os_vif [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07')
Oct 06 14:13:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:11.371 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:11.372 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:11.372 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:12 compute-0 nova_compute[192903]: 2025-10-06 14:13:12.679 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:13:12 compute-0 nova_compute[192903]: 2025-10-06 14:13:12.679 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:13:12 compute-0 nova_compute[192903]: 2025-10-06 14:13:12.680 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] No VIF found with MAC fa:16:3e:35:ed:4b, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:13:12 compute-0 nova_compute[192903]: 2025-10-06 14:13:12.680 2 INFO nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Using config drive
Oct 06 14:13:13 compute-0 nova_compute[192903]: 2025-10-06 14:13:13.190 2 WARNING neutronclient.v2_0.client [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:13 compute-0 podman[221405]: 2025-10-06 14:13:13.195097108 +0000 UTC m=+0.063077856 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public)
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.095 2 INFO nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Creating config drive at /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.config
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.100 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpzftoxvsy execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.240 2 DEBUG oslo_concurrency.processutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpzftoxvsy" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:14 compute-0 kernel: tap49d1791e-07: entered promiscuous mode
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.3166] manager: (tap49d1791e-07): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Oct 06 14:13:14 compute-0 ovn_controller[95205]: 2025-10-06T14:13:14Z|00134|binding|INFO|Claiming lport 49d1791e-0799-488d-b6c7-50cd175f0414 for this chassis.
Oct 06 14:13:14 compute-0 ovn_controller[95205]: 2025-10-06T14:13:14Z|00135|binding|INFO|49d1791e-0799-488d-b6c7-50cd175f0414: Claiming fa:16:3e:35:ed:4b 10.100.0.14
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.324 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:ed:4b 10.100.0.14'], port_security=['fa:16:3e:35:ed:4b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'cd58c0d7-99da-40d0-b5df-9a3dbac42360', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=49d1791e-0799-488d-b6c7-50cd175f0414) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.325 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 49d1791e-0799-488d-b6c7-50cd175f0414 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 bound to our chassis
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.326 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:13:14 compute-0 ovn_controller[95205]: 2025-10-06T14:13:14Z|00136|binding|INFO|Setting lport 49d1791e-0799-488d-b6c7-50cd175f0414 ovn-installed in OVS
Oct 06 14:13:14 compute-0 ovn_controller[95205]: 2025-10-06T14:13:14Z|00137|binding|INFO|Setting lport 49d1791e-0799-488d-b6c7-50cd175f0414 up in Southbound
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.344 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bd431ec3-b216-4af6-b301-d7e34b54abc4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.345 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap37630f0a-81 in ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:13:14 compute-0 systemd-udevd[221443]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.347 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap37630f0a-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.348 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[99672b90-eee5-476d-b996-ed1b9260a52b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.349 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8df8c5-cf92-457f-a8f5-be9b32ce11f8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.3623] device (tap49d1791e-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.3631] device (tap49d1791e-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.362 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecf25ac-8a5e-4535-a936-4e4d3790cd13]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.369 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8c92941e-1b67-4402-b5ad-1310cfdc7250]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 systemd-machined[152985]: New machine qemu-11-instance-0000000f.
Oct 06 14:13:14 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.407 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2f7e75-12ec-45f9-b211-8d7e13d6408d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.411 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec49474-f0eb-464b-83b0-6ec37939825e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.4125] manager: (tap37630f0a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.455 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[dbaaee9b-507b-4f91-bb4e-34b7fdef450a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.458 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[40cf5c88-22c0-4ced-86dd-1dfda06ef874]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.4837] device (tap37630f0a-80): carrier: link connected
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.493 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[b6696ed3-f0e9-447d-be12-c4e2793fd701]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.518 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4762cedd-6656-4797-9d21-7425d48a2626]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445612, 'reachable_time': 44848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221478, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.542 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9c7717-8a25-4beb-a942-d74bc63315cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:7095'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445612, 'tstamp': 445612}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221479, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.565 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a6706429-9fee-4156-bc21-3e8e1d7abce9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445612, 'reachable_time': 44848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221480, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.601 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8bd45c-3601-42fa-94a1-b5b656d92f13]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.661 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9cab4aec-00c4-4eae-95c2-464a66c486bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.662 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.663 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.663 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:14 compute-0 NetworkManager[52035]: <info>  [1759759994.6651] manager: (tap37630f0a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 kernel: tap37630f0a-80: entered promiscuous mode
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.668 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 ovn_controller[95205]: 2025-10-06T14:13:14Z|00138|binding|INFO|Releasing lport 01e7ff9b-7072-42b9-b412-c40a88736ea9 from this chassis (sb_readonly=0)
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 nova_compute[192903]: 2025-10-06 14:13:14.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.694 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[def434d0-c2d6-4705-a363-db5d7bada98c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.695 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.695 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.695 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 37630f0a-8aad-4e9a-8c81-a92f8d673f93 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.695 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.696 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a260bbe6-15c2-4656-b081-990967caff04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.697 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.697 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[238384fa-eb39-4ed0-affa-1efaf5ad1ef4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.698 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:13:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:14.703 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'env', 'PROCESS_TAG=haproxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/37630f0a-8aad-4e9a-8c81-a92f8d673f93.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:13:15 compute-0 podman[221519]: 2025-10-06 14:13:15.108356846 +0000 UTC m=+0.054721955 container create e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:13:15 compute-0 systemd[1]: Started libpod-conmon-e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8.scope.
Oct 06 14:13:15 compute-0 podman[221519]: 2025-10-06 14:13:15.079354001 +0000 UTC m=+0.025719130 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:13:15 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:13:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9405d394655f38f237084b123667cad6fae7c6082bb89166b63947340d00f9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.204 2 DEBUG nova.compute.manager [req-0fe71263-4426-4250-93eb-f4b41ba091a9 req-ce829873-5d1f-4d69-90bb-a7d7c58aeff7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:13:15 compute-0 podman[221519]: 2025-10-06 14:13:15.204848403 +0000 UTC m=+0.151213522 container init e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.205 2 DEBUG oslo_concurrency.lockutils [req-0fe71263-4426-4250-93eb-f4b41ba091a9 req-ce829873-5d1f-4d69-90bb-a7d7c58aeff7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.205 2 DEBUG oslo_concurrency.lockutils [req-0fe71263-4426-4250-93eb-f4b41ba091a9 req-ce829873-5d1f-4d69-90bb-a7d7c58aeff7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.205 2 DEBUG oslo_concurrency.lockutils [req-0fe71263-4426-4250-93eb-f4b41ba091a9 req-ce829873-5d1f-4d69-90bb-a7d7c58aeff7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.205 2 DEBUG nova.compute.manager [req-0fe71263-4426-4250-93eb-f4b41ba091a9 req-ce829873-5d1f-4d69-90bb-a7d7c58aeff7 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Processing event network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.206 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.209 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.212 2 INFO nova.virt.libvirt.driver [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance spawned successfully.
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.213 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:13:15 compute-0 podman[221519]: 2025-10-06 14:13:15.217322283 +0000 UTC m=+0.163687392 container start e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:13:15 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [NOTICE]   (221538) : New worker (221540) forked
Oct 06 14:13:15 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [NOTICE]   (221538) : Loading success.
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.730 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.731 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.732 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.733 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.734 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:15 compute-0 nova_compute[192903]: 2025-10-06 14:13:15.735 2 DEBUG nova.virt.libvirt.driver [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:13:16 compute-0 nova_compute[192903]: 2025-10-06 14:13:16.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:16 compute-0 nova_compute[192903]: 2025-10-06 14:13:16.246 2 INFO nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Took 10.44 seconds to spawn the instance on the hypervisor.
Oct 06 14:13:16 compute-0 nova_compute[192903]: 2025-10-06 14:13:16.246 2 DEBUG nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:13:16 compute-0 nova_compute[192903]: 2025-10-06 14:13:16.786 2 INFO nova.compute.manager [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Took 15.67 seconds to build instance.
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.293 2 DEBUG oslo_concurrency.lockutils [None req-b6b29fae-d7d2-4a7d-94e4-2b84e6253cc9 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.195s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.303 2 DEBUG nova.compute.manager [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.303 2 DEBUG oslo_concurrency.lockutils [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.304 2 DEBUG oslo_concurrency.lockutils [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.304 2 DEBUG oslo_concurrency.lockutils [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.305 2 DEBUG nova.compute.manager [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] No waiting events found dispatching network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:13:17 compute-0 nova_compute[192903]: 2025-10-06 14:13:17.305 2 WARNING nova.compute.manager [req-c4db6030-1ff8-4518-8591-2d0798515621 req-1f04b4fc-5501-44e8-867b-aa7973543fd3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received unexpected event network-vif-plugged-49d1791e-0799-488d-b6c7-50cd175f0414 for instance with vm_state active and task_state None.
Oct 06 14:13:19 compute-0 nova_compute[192903]: 2025-10-06 14:13:19.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:21 compute-0 nova_compute[192903]: 2025-10-06 14:13:21.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:24 compute-0 nova_compute[192903]: 2025-10-06 14:13:24.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:26 compute-0 nova_compute[192903]: 2025-10-06 14:13:26.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:27 compute-0 ovn_controller[95205]: 2025-10-06T14:13:27Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:ed:4b 10.100.0.14
Oct 06 14:13:27 compute-0 ovn_controller[95205]: 2025-10-06T14:13:27Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:ed:4b 10.100.0.14
Oct 06 14:13:29 compute-0 nova_compute[192903]: 2025-10-06 14:13:29.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:29 compute-0 podman[221564]: 2025-10-06 14:13:29.207841058 +0000 UTC m=+0.069089115 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:13:29 compute-0 podman[203308]: time="2025-10-06T14:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:13:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:13:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Oct 06 14:13:30 compute-0 nova_compute[192903]: 2025-10-06 14:13:30.369 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Creating tmpfile /var/lib/nova/instances/tmpp59a7gvc to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:13:30 compute-0 nova_compute[192903]: 2025-10-06 14:13:30.370 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:30 compute-0 nova_compute[192903]: 2025-10-06 14:13:30.389 2 DEBUG nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp59a7gvc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:13:30 compute-0 podman[221591]: 2025-10-06 14:13:30.49603791 +0000 UTC m=+0.071709424 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 14:13:30 compute-0 podman[221590]: 2025-10-06 14:13:30.549242574 +0000 UTC m=+0.120860811 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:13:30 compute-0 podman[221589]: 2025-10-06 14:13:30.573040662 +0000 UTC m=+0.146512338 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Oct 06 14:13:31 compute-0 nova_compute[192903]: 2025-10-06 14:13:31.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: ERROR   14:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: ERROR   14:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: ERROR   14:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: ERROR   14:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: ERROR   14:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:13:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:13:32 compute-0 nova_compute[192903]: 2025-10-06 14:13:32.444 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:34 compute-0 nova_compute[192903]: 2025-10-06 14:13:34.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:36 compute-0 nova_compute[192903]: 2025-10-06 14:13:36.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:36 compute-0 nova_compute[192903]: 2025-10-06 14:13:36.812 2 DEBUG nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp59a7gvc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a7577fff-cf4c-4c47-a754-b5e0b86ad3e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:13:37 compute-0 nova_compute[192903]: 2025-10-06 14:13:37.830 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:13:37 compute-0 nova_compute[192903]: 2025-10-06 14:13:37.831 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:13:37 compute-0 nova_compute[192903]: 2025-10-06 14:13:37.831 2 DEBUG nova.network.neutron [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:13:38 compute-0 nova_compute[192903]: 2025-10-06 14:13:38.352 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:39 compute-0 nova_compute[192903]: 2025-10-06 14:13:39.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:39 compute-0 nova_compute[192903]: 2025-10-06 14:13:39.323 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:39 compute-0 nova_compute[192903]: 2025-10-06 14:13:39.600 2 DEBUG nova.network.neutron [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Updating instance_info_cache with network_info: [{"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.106 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.122 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp59a7gvc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a7577fff-cf4c-4c47-a754-b5e0b86ad3e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.123 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Creating instance directory: /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.124 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Creating disk.info with the contents: {'/var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk': 'qcow2', '/var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.124 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.125 2 DEBUG nova.objects.instance [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.633 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.639 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.641 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.704 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.705 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.705 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.706 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.710 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.711 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.796 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.798 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.840 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.841 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.842 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.904 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.906 2 DEBUG nova.virt.disk.api [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.907 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.969 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.971 2 DEBUG nova.virt.disk.api [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:13:40 compute-0 nova_compute[192903]: 2025-10-06 14:13:40.972 2 DEBUG nova.objects.instance [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 podman[221669]: 2025-10-06 14:13:41.24563664 +0000 UTC m=+0.094566807 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.483 2 DEBUG nova.objects.base [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<a7577fff-cf4c-4c47-a754-b5e0b86ad3e0> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.484 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.527 2 DEBUG oslo_concurrency.processutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk.config 497664" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.529 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.531 2 DEBUG nova.virt.libvirt.vif [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:12:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1243891477',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1243891477',id=14,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:12:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-mp68fr77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:12:56Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=a7577fff-cf4c-4c47-a754-b5e0b86ad3e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.532 2 DEBUG nova.network.os_vif_util [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.533 2 DEBUG nova.network.os_vif_util [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.534 2 DEBUG os_vif [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '34003ff6-e657-5ee4-90ed-8bf4ed1a2ea2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70bbf3ec-6e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap70bbf3ec-6e, col_values=(('qos', UUID('c03b0a8c-480e-4829-817b-be837721910e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap70bbf3ec-6e, col_values=(('external_ids', {'iface-id': '70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:f6:98', 'vm-uuid': 'a7577fff-cf4c-4c47-a754-b5e0b86ad3e0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 NetworkManager[52035]: <info>  [1759760021.5483] manager: (tap70bbf3ec-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.555 2 INFO os_vif [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e')
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.556 2 DEBUG nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.556 2 DEBUG nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp59a7gvc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a7577fff-cf4c-4c47-a754-b5e0b86ad3e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.557 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:41 compute-0 nova_compute[192903]: 2025-10-06 14:13:41.990 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:43 compute-0 nova_compute[192903]: 2025-10-06 14:13:43.247 2 DEBUG nova.network.neutron [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Port 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:13:43 compute-0 nova_compute[192903]: 2025-10-06 14:13:43.263 2 DEBUG nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp59a7gvc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a7577fff-cf4c-4c47-a754-b5e0b86ad3e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:13:44 compute-0 nova_compute[192903]: 2025-10-06 14:13:44.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:44 compute-0 podman[221694]: 2025-10-06 14:13:44.226608219 +0000 UTC m=+0.065453658 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct 06 14:13:44 compute-0 ovn_controller[95205]: 2025-10-06T14:13:44Z|00139|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 06 14:13:46 compute-0 kernel: tap70bbf3ec-6e: entered promiscuous mode
Oct 06 14:13:46 compute-0 NetworkManager[52035]: <info>  [1759760026.0769] manager: (tap70bbf3ec-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Oct 06 14:13:46 compute-0 ovn_controller[95205]: 2025-10-06T14:13:46Z|00140|binding|INFO|Claiming lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b for this additional chassis.
Oct 06 14:13:46 compute-0 ovn_controller[95205]: 2025-10-06T14:13:46Z|00141|binding|INFO|70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b: Claiming fa:16:3e:3f:f6:98 10.100.0.4
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.088 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:f6:98 10.100.0.4'], port_security=['fa:16:3e:3f:f6:98 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7577fff-cf4c-4c47-a754-b5e0b86ad3e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.090 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.091 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:13:46 compute-0 ovn_controller[95205]: 2025-10-06T14:13:46Z|00142|binding|INFO|Setting lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b ovn-installed in OVS
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.121 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6bbe00d1-56f6-44ad-a077-591ea590dbcf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 systemd-udevd[221731]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:13:46 compute-0 systemd-machined[152985]: New machine qemu-12-instance-0000000e.
Oct 06 14:13:46 compute-0 NetworkManager[52035]: <info>  [1759760026.1616] device (tap70bbf3ec-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:13:46 compute-0 NetworkManager[52035]: <info>  [1759760026.1633] device (tap70bbf3ec-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:13:46 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.169 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[54e3ce0b-7628-4571-97e0-929ba2cd7717]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.174 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[eb200cfa-1160-4934-905f-7b5b6c2adc57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.221 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[06a4c269-ff0c-43be-b764-5a6328a36814]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.243 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[08076599-39cd-4232-b452-ec67d592ae34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445612, 'reachable_time': 44848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221742, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.265 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7409de95-7ce7-40d5-9305-c776c2888518]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445626, 'tstamp': 445626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221744, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445629, 'tstamp': 445629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221744, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.267 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.271 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.271 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.272 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.272 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:13:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:13:46.274 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a20c675c-7552-43c4-9a29-803de32686a2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:13:46 compute-0 nova_compute[192903]: 2025-10-06 14:13:46.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:48 compute-0 ovn_controller[95205]: 2025-10-06T14:13:48Z|00143|binding|INFO|Claiming lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b for this chassis.
Oct 06 14:13:48 compute-0 ovn_controller[95205]: 2025-10-06T14:13:48Z|00144|binding|INFO|70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b: Claiming fa:16:3e:3f:f6:98 10.100.0.4
Oct 06 14:13:48 compute-0 ovn_controller[95205]: 2025-10-06T14:13:48Z|00145|binding|INFO|Setting lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b up in Southbound
Oct 06 14:13:48 compute-0 nova_compute[192903]: 2025-10-06 14:13:48.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:48 compute-0 nova_compute[192903]: 2025-10-06 14:13:48.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.090 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.633 2 INFO nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Post operation of migration started
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.635 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.773 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.774 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.886 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.886 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:13:49 compute-0 nova_compute[192903]: 2025-10-06 14:13:49.887 2 DEBUG nova.network.neutron [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:13:50 compute-0 nova_compute[192903]: 2025-10-06 14:13:50.394 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:51 compute-0 nova_compute[192903]: 2025-10-06 14:13:51.257 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:51 compute-0 nova_compute[192903]: 2025-10-06 14:13:51.412 2 DEBUG nova.network.neutron [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Updating instance_info_cache with network_info: [{"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:13:51 compute-0 nova_compute[192903]: 2025-10-06 14:13:51.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:51 compute-0 nova_compute[192903]: 2025-10-06 14:13:51.920 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:13:52 compute-0 nova_compute[192903]: 2025-10-06 14:13:52.441 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:52 compute-0 nova_compute[192903]: 2025-10-06 14:13:52.441 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:52 compute-0 nova_compute[192903]: 2025-10-06 14:13:52.442 2 DEBUG oslo_concurrency.lockutils [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:52 compute-0 nova_compute[192903]: 2025-10-06 14:13:52.447 2 INFO nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:13:52 compute-0 virtqemud[192802]: Domain id=12 name='instance-0000000e' uuid=a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 is tainted: custom-monitor
Oct 06 14:13:53 compute-0 nova_compute[192903]: 2025-10-06 14:13:53.090 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:53 compute-0 nova_compute[192903]: 2025-10-06 14:13:53.455 2 INFO nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:13:54 compute-0 nova_compute[192903]: 2025-10-06 14:13:54.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:54 compute-0 nova_compute[192903]: 2025-10-06 14:13:54.463 2 INFO nova.virt.libvirt.driver [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:13:54 compute-0 nova_compute[192903]: 2025-10-06 14:13:54.468 2 DEBUG nova.compute.manager [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:13:54 compute-0 nova_compute[192903]: 2025-10-06 14:13:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:54 compute-0 nova_compute[192903]: 2025-10-06 14:13:54.982 2 DEBUG nova.objects.instance [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:13:55 compute-0 nova_compute[192903]: 2025-10-06 14:13:55.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:55 compute-0 nova_compute[192903]: 2025-10-06 14:13:55.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:55 compute-0 nova_compute[192903]: 2025-10-06 14:13:55.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:55 compute-0 nova_compute[192903]: 2025-10-06 14:13:55.096 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.001 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.096 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.096 2 WARNING neutronclient.v2_0.client [None req-ddbe2f2f-619f-4926-8f0d-10a248738b31 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.150 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.240 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.241 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.291 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.296 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.354 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.355 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.427 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.663 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.665 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.693 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.694 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5539MB free_disk=73.24469757080078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.694 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:13:56 compute-0 nova_compute[192903]: 2025-10-06 14:13:56.695 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:13:57 compute-0 nova_compute[192903]: 2025-10-06 14:13:57.715 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 as it has an incoming, in-progress migration 548d1e24-f066-457a-862c-1bf3236c08ca. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:13:57 compute-0 nova_compute[192903]: 2025-10-06 14:13:57.716 2 DEBUG nova.objects.instance [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.223 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.259 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance cd58c0d7-99da-40d0-b5df-9a3dbac42360 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.260 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.260 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.260 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:13:56 up  1:14,  0 user,  load average: 0.47, 0.47, 0.45\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_58ece9e5771a44c2918fd8f7783186f0': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.327 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:13:58 compute-0 nova_compute[192903]: 2025-10-06 14:13:58.838 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:13:59 compute-0 nova_compute[192903]: 2025-10-06 14:13:59.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:13:59 compute-0 nova_compute[192903]: 2025-10-06 14:13:59.351 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:13:59 compute-0 nova_compute[192903]: 2025-10-06 14:13:59.351 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.656s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:13:59 compute-0 nova_compute[192903]: 2025-10-06 14:13:59.352 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:13:59 compute-0 podman[203308]: time="2025-10-06T14:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:13:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:13:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.079 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.079 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.080 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.081 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.081 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.100 2 INFO nova.compute.manager [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Terminating instance
Oct 06 14:14:00 compute-0 podman[221781]: 2025-10-06 14:14:00.206117822 +0000 UTC m=+0.066776833 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.623 2 DEBUG nova.compute.manager [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:14:00 compute-0 kernel: tap49d1791e-07 (unregistering): left promiscuous mode
Oct 06 14:14:00 compute-0 NetworkManager[52035]: <info>  [1759760040.6573] device (tap49d1791e-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:14:00 compute-0 ovn_controller[95205]: 2025-10-06T14:14:00Z|00146|binding|INFO|Releasing lport 49d1791e-0799-488d-b6c7-50cd175f0414 from this chassis (sb_readonly=0)
Oct 06 14:14:00 compute-0 ovn_controller[95205]: 2025-10-06T14:14:00Z|00147|binding|INFO|Setting lport 49d1791e-0799-488d-b6c7-50cd175f0414 down in Southbound
Oct 06 14:14:00 compute-0 ovn_controller[95205]: 2025-10-06T14:14:00Z|00148|binding|INFO|Removing iface tap49d1791e-07 ovn-installed in OVS
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.675 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:ed:4b 10.100.0.14'], port_security=['fa:16:3e:35:ed:4b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'cd58c0d7-99da-40d0-b5df-9a3dbac42360', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=49d1791e-0799-488d-b6c7-50cd175f0414) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.676 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 49d1791e-0799-488d-b6c7-50cd175f0414 in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.677 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.715 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4d224b-8af3-48e3-9cc3-8fd9f5abd686]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.747 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9f647d-ac3e-4c64-aa97-61093ee2c22d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.752 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[917f8f6a-c564-4c86-bcb3-ee08650a8968]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct 06 14:14:00 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 14.144s CPU time.
Oct 06 14:14:00 compute-0 systemd-machined[152985]: Machine qemu-11-instance-0000000f terminated.
Oct 06 14:14:00 compute-0 podman[221808]: 2025-10-06 14:14:00.780811641 +0000 UTC m=+0.084794829 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd)
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.783 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[73ab269f-2071-4cbb-bda0-b8b70f220381]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 podman[221807]: 2025-10-06 14:14:00.795792307 +0000 UTC m=+0.111792762 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.797 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9847dce6-8479-4c49-82d9-2ee14a658d06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37630f0a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:70:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445612, 'reachable_time': 44848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221875, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 podman[221809]: 2025-10-06 14:14:00.803540901 +0000 UTC m=+0.105143346 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator 
team, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.813 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[fad5051e-670e-4526-bc2e-74907db9f33c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445626, 'tstamp': 445626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221876, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap37630f0a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445629, 'tstamp': 445629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221876, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.815 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.821 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37630f0a-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.821 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.821 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37630f0a-80, col_values=(('external_ids', {'iface-id': '01e7ff9b-7072-42b9-b412-c40a88736ea9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.821 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:14:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:00.822 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[98b169bb-a6d7-424e-b6f4-4748cc4c9ea7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-37630f0a-8aad-4e9a-8c81-a92f8d673f93\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 37630f0a-8aad-4e9a-8c81-a92f8d673f93\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.885 2 INFO nova.virt.libvirt.driver [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Instance destroyed successfully.
Oct 06 14:14:00 compute-0 nova_compute[192903]: 2025-10-06 14:14:00.886 2 DEBUG nova.objects.instance [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'resources' on Instance uuid cd58c0d7-99da-40d0-b5df-9a3dbac42360 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.086 2 DEBUG nova.compute.manager [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.086 2 DEBUG oslo_concurrency.lockutils [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.087 2 DEBUG oslo_concurrency.lockutils [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.087 2 DEBUG oslo_concurrency.lockutils [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.087 2 DEBUG nova.compute.manager [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] No waiting events found dispatching network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.087 2 DEBUG nova.compute.manager [req-7294343a-fbe1-4cb2-9590-262c217faffc req-b12e2d5e-c5aa-4f3b-a560-51a250c51cf3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.400 2 DEBUG nova.virt.libvirt.vif [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:12:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-838001624',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-838001624',id=15,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:13:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-upqtbhcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:13:16Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=cd58c0d7-99da-40d0-b5df-9a3dbac42360,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.401 2 DEBUG nova.network.os_vif_util [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "49d1791e-0799-488d-b6c7-50cd175f0414", "address": "fa:16:3e:35:ed:4b", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49d1791e-07", "ovs_interfaceid": "49d1791e-0799-488d-b6c7-50cd175f0414", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.402 2 DEBUG nova.network.os_vif_util [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.402 2 DEBUG os_vif [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.405 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49d1791e-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: ERROR   14:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: ERROR   14:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: ERROR   14:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: ERROR   14:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: ERROR   14:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:14:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.413 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1777c7e6-7c88-446a-9afd-0d3fa7d1fc94) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.422 2 INFO os_vif [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:ed:4b,bridge_name='br-int',has_traffic_filtering=True,id=49d1791e-0799-488d-b6c7-50cd175f0414,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49d1791e-07')
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.425 2 INFO nova.virt.libvirt.driver [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Deleting instance files /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360_del
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.426 2 INFO nova.virt.libvirt.driver [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Deletion of /var/lib/nova/instances/cd58c0d7-99da-40d0-b5df-9a3dbac42360_del complete
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.858 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.858 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.858 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.858 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.859 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.942 2 INFO nova.compute.manager [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.942 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.943 2 DEBUG nova.compute.manager [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.943 2 DEBUG nova.network.neutron [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:14:01 compute-0 nova_compute[192903]: 2025-10-06 14:14:01.943 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:14:02 compute-0 nova_compute[192903]: 2025-10-06 14:14:02.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:02 compute-0 nova_compute[192903]: 2025-10-06 14:14:02.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:02 compute-0 nova_compute[192903]: 2025-10-06 14:14:02.979 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.149 2 DEBUG nova.compute.manager [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.150 2 DEBUG oslo_concurrency.lockutils [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.150 2 DEBUG oslo_concurrency.lockutils [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.151 2 DEBUG oslo_concurrency.lockutils [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.151 2 DEBUG nova.compute.manager [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] No waiting events found dispatching network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.151 2 DEBUG nova.compute.manager [req-ce17c48e-cb03-4c15-a6d6-426cfa2f39ab req-1cc400ed-b301-4dce-8745-9958d0551450 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-unplugged-49d1791e-0799-488d-b6c7-50cd175f0414 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:03 compute-0 nova_compute[192903]: 2025-10-06 14:14:03.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:14:04 compute-0 nova_compute[192903]: 2025-10-06 14:14:04.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:04 compute-0 nova_compute[192903]: 2025-10-06 14:14:04.755 2 DEBUG nova.network.neutron [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:14:05 compute-0 nova_compute[192903]: 2025-10-06 14:14:05.233 2 DEBUG nova.compute.manager [req-8d2654bd-109a-4e57-bbd2-af558ed62f3c req-5b61ebce-bd50-4215-9dd0-6f01cce3c5cd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Received event network-vif-deleted-49d1791e-0799-488d-b6c7-50cd175f0414 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:05 compute-0 nova_compute[192903]: 2025-10-06 14:14:05.272 2 INFO nova.compute.manager [-] [instance: cd58c0d7-99da-40d0-b5df-9a3dbac42360] Took 3.33 seconds to deallocate network for instance.
Oct 06 14:14:05 compute-0 nova_compute[192903]: 2025-10-06 14:14:05.801 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:05 compute-0 nova_compute[192903]: 2025-10-06 14:14:05.802 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:05 compute-0 nova_compute[192903]: 2025-10-06 14:14:05.890 2 DEBUG nova.compute.provider_tree [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:14:06 compute-0 nova_compute[192903]: 2025-10-06 14:14:06.397 2 DEBUG nova.scheduler.client.report [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:14:06 compute-0 nova_compute[192903]: 2025-10-06 14:14:06.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:06 compute-0 nova_compute[192903]: 2025-10-06 14:14:06.910 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:06 compute-0 nova_compute[192903]: 2025-10-06 14:14:06.952 2 INFO nova.scheduler.client.report [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Deleted allocations for instance cd58c0d7-99da-40d0-b5df-9a3dbac42360
Oct 06 14:14:07 compute-0 nova_compute[192903]: 2025-10-06 14:14:07.985 2 DEBUG oslo_concurrency.lockutils [None req-19f0d9c4-8a14-4be4-a2a1-e8ea02297115 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "cd58c0d7-99da-40d0-b5df-9a3dbac42360" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.906s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.118 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.119 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.119 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.120 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.120 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.133 2 INFO nova.compute.manager [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Terminating instance
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.650 2 DEBUG nova.compute.manager [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:14:09 compute-0 kernel: tap70bbf3ec-6e (unregistering): left promiscuous mode
Oct 06 14:14:09 compute-0 NetworkManager[52035]: <info>  [1759760049.6790] device (tap70bbf3ec-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:14:09 compute-0 ovn_controller[95205]: 2025-10-06T14:14:09Z|00149|binding|INFO|Releasing lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b from this chassis (sb_readonly=0)
Oct 06 14:14:09 compute-0 ovn_controller[95205]: 2025-10-06T14:14:09Z|00150|binding|INFO|Setting lport 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b down in Southbound
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 ovn_controller[95205]: 2025-10-06T14:14:09Z|00151|binding|INFO|Removing iface tap70bbf3ec-6e ovn-installed in OVS
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:09.700 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:f6:98 10.100.0.4'], port_security=['fa:16:3e:3f:f6:98 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7577fff-cf4c-4c47-a754-b5e0b86ad3e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58ece9e5771a44c2918fd8f7783186f0', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'c7ea2fea-f20b-4c5b-b10c-5c34958c77de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa0b3cc2-3f34-41b3-b7d0-2541da68c0c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:14:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:09.704 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b in datapath 37630f0a-8aad-4e9a-8c81-a92f8d673f93 unbound from our chassis
Oct 06 14:14:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:09.707 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37630f0a-8aad-4e9a-8c81-a92f8d673f93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:14:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:09.709 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[93311311-318f-437d-a823-485b792fb3cc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:09.710 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 namespace which is not needed anymore
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Oct 06 14:14:09 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 2.380s CPU time.
Oct 06 14:14:09 compute-0 systemd-machined[152985]: Machine qemu-12-instance-0000000e terminated.
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:09 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [NOTICE]   (221538) : haproxy version is 3.0.5-8e879a5
Oct 06 14:14:09 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [NOTICE]   (221538) : path to executable is /usr/sbin/haproxy
Oct 06 14:14:09 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [WARNING]  (221538) : Exiting Master process...
Oct 06 14:14:09 compute-0 podman[221927]: 2025-10-06 14:14:09.904788504 +0000 UTC m=+0.049430036 container kill e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:14:09 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [ALERT]    (221538) : Current worker (221540) exited with code 143 (Terminated)
Oct 06 14:14:09 compute-0 neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93[221534]: [WARNING]  (221538) : All workers exited. Exiting... (0)
Oct 06 14:14:09 compute-0 systemd[1]: libpod-e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8.scope: Deactivated successfully.
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.929 2 INFO nova.virt.libvirt.driver [-] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Instance destroyed successfully.
Oct 06 14:14:09 compute-0 nova_compute[192903]: 2025-10-06 14:14:09.930 2 DEBUG nova.objects.instance [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lazy-loading 'resources' on Instance uuid a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:14:09 compute-0 podman[221956]: 2025-10-06 14:14:09.958227315 +0000 UTC m=+0.032198031 container died e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 14:14:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8-userdata-shm.mount: Deactivated successfully.
Oct 06 14:14:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-c9405d394655f38f237084b123667cad6fae7c6082bb89166b63947340d00f9a-merged.mount: Deactivated successfully.
Oct 06 14:14:10 compute-0 podman[221956]: 2025-10-06 14:14:10.002088242 +0000 UTC m=+0.076058978 container cleanup e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:14:10 compute-0 systemd[1]: libpod-conmon-e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8.scope: Deactivated successfully.
Oct 06 14:14:10 compute-0 podman[221960]: 2025-10-06 14:14:10.018018603 +0000 UTC m=+0.077658741 container remove e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.025 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b9d14f-4b32-40f5-a82f-f6e9cf513e58]: (4, ("Mon Oct  6 02:14:09 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 (e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8)\ne323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8\nMon Oct  6 02:14:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 (e323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8)\ne323b9c6b6e4f0e8513988a2ef69735302455e9853bf1d9b8b4921d5602241a8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.026 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc4d062-7ed4-4876-8cb6-1585d7634a1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.026 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37630f0a-8aad-4e9a-8c81-a92f8d673f93.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.027 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1d5fcd-1a82-46c1-a19c-56267f180a32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.027 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37630f0a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 kernel: tap37630f0a-80: left promiscuous mode
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.057 2 DEBUG nova.compute.manager [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Received event network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.058 2 DEBUG oslo_concurrency.lockutils [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.058 2 DEBUG oslo_concurrency.lockutils [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.058 2 DEBUG oslo_concurrency.lockutils [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.059 2 DEBUG nova.compute.manager [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] No waiting events found dispatching network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.059 2 DEBUG nova.compute.manager [req-71e0615a-bfe5-413c-ab55-376bf64a311f req-a7938e6c-51df-4b2c-b1d2-816756567ad4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Received event network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.063 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[79dc7975-93d8-4202-869d-7dd7b4f9af91]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.090 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[129b5713-efed-482b-ae5f-fbd5e1daa854]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.091 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9450bc91-5ec2-42ec-8bba-547d1f2aade6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.112 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[790a70f7-b486-47e9-b181-0d1b0534bd80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445604, 'reachable_time': 32333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221991, 'error': None, 'target': 'ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d37630f0a\x2d8aad\x2d4e9a\x2d8c81\x2da92f8d673f93.mount: Deactivated successfully.
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.116 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-37630f0a-8aad-4e9a-8c81-a92f8d673f93 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.116 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[eeed870c-3c6e-4d41-b1f4-01c4ee3dfbba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.126 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:14:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:10.126 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.438 2 DEBUG nova.virt.libvirt.vif [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:12:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1243891477',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1243891477',id=14,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:12:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58ece9e5771a44c2918fd8f7783186f0',ramdisk_id='',reservation_id='r-mp68fr77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-251874218',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-251874218-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:13:55Z,user_data=None,user_id='f242e9aec50346eaa7b3bddbda127d84',uuid=a7577fff-cf4c-4c47-a754-b5e0b86ad3e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.438 2 DEBUG nova.network.os_vif_util [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converting VIF {"id": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "address": "fa:16:3e:3f:f6:98", "network": {"id": "37630f0a-8aad-4e9a-8c81-a92f8d673f93", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-15551358-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "040822eef8234394a03ec96f615f5048", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70bbf3ec-6e", "ovs_interfaceid": "70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.439 2 DEBUG nova.network.os_vif_util [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.439 2 DEBUG os_vif [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70bbf3ec-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.446 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c03b0a8c-480e-4829-817b-be837721910e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.451 2 INFO os_vif [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:f6:98,bridge_name='br-int',has_traffic_filtering=True,id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b,network=Network(37630f0a-8aad-4e9a-8c81-a92f8d673f93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70bbf3ec-6e')
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.452 2 INFO nova.virt.libvirt.driver [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Deleting instance files /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0_del
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.453 2 INFO nova.virt.libvirt.driver [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Deletion of /var/lib/nova/instances/a7577fff-cf4c-4c47-a754-b5e0b86ad3e0_del complete
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.966 2 INFO nova.compute.manager [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.966 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.967 2 DEBUG nova.compute.manager [-] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.967 2 DEBUG nova.network.neutron [-] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:14:10 compute-0 nova_compute[192903]: 2025-10-06 14:14:10.968 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:14:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:11.373 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:11.373 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:11.375 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:11 compute-0 nova_compute[192903]: 2025-10-06 14:14:11.987 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.123 2 DEBUG nova.compute.manager [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Received event network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.125 2 DEBUG oslo_concurrency.lockutils [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.125 2 DEBUG oslo_concurrency.lockutils [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.125 2 DEBUG oslo_concurrency.lockutils [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.125 2 DEBUG nova.compute.manager [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] No waiting events found dispatching network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:14:12 compute-0 nova_compute[192903]: 2025-10-06 14:14:12.125 2 DEBUG nova.compute.manager [req-58e41199-250f-4205-83e9-ce877c59ed67 req-6d22cea5-473a-4b26-9128-062eb5e89a85 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Received event network-vif-unplugged-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:14:12 compute-0 podman[221994]: 2025-10-06 14:14:12.225389095 +0000 UTC m=+0.086254118 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.057 2 DEBUG nova.compute.manager [req-4f7f2e1f-58be-4ebe-9b53-6108c722441b req-861e3c6f-a518-4a10-b9aa-e3eedf4c45f9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Received event network-vif-deleted-70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.057 2 INFO nova.compute.manager [req-4f7f2e1f-58be-4ebe-9b53-6108c722441b req-861e3c6f-a518-4a10-b9aa-e3eedf4c45f9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Neutron deleted interface 70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b; detaching it from the instance and deleting it from the info cache
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.058 2 DEBUG nova.network.neutron [req-4f7f2e1f-58be-4ebe-9b53-6108c722441b req-861e3c6f-a518-4a10-b9aa-e3eedf4c45f9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.505 2 DEBUG nova.network.neutron [-] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:14:14 compute-0 nova_compute[192903]: 2025-10-06 14:14:14.567 2 DEBUG nova.compute.manager [req-4f7f2e1f-58be-4ebe-9b53-6108c722441b req-861e3c6f-a518-4a10-b9aa-e3eedf4c45f9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Detach interface failed, port_id=70bbf3ec-6ea8-46e6-85ad-ff76cdbd1c6b, reason: Instance a7577fff-cf4c-4c47-a754-b5e0b86ad3e0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:14:15 compute-0 nova_compute[192903]: 2025-10-06 14:14:15.012 2 INFO nova.compute.manager [-] [instance: a7577fff-cf4c-4c47-a754-b5e0b86ad3e0] Took 4.05 seconds to deallocate network for instance.
Oct 06 14:14:15 compute-0 podman[222014]: 2025-10-06 14:14:15.211907303 +0000 UTC m=+0.076741236 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350)
Oct 06 14:14:15 compute-0 nova_compute[192903]: 2025-10-06 14:14:15.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:15 compute-0 nova_compute[192903]: 2025-10-06 14:14:15.530 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:15 compute-0 nova_compute[192903]: 2025-10-06 14:14:15.531 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:15 compute-0 nova_compute[192903]: 2025-10-06 14:14:15.586 2 DEBUG nova.compute.provider_tree [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:14:16 compute-0 nova_compute[192903]: 2025-10-06 14:14:16.094 2 DEBUG nova.scheduler.client.report [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:14:16 compute-0 nova_compute[192903]: 2025-10-06 14:14:16.611 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:16 compute-0 nova_compute[192903]: 2025-10-06 14:14:16.641 2 INFO nova.scheduler.client.report [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Deleted allocations for instance a7577fff-cf4c-4c47-a754-b5e0b86ad3e0
Oct 06 14:14:17 compute-0 nova_compute[192903]: 2025-10-06 14:14:17.676 2 DEBUG oslo_concurrency.lockutils [None req-0f3eb1f9-1e87-4bfe-8e96-d63ba2eb9323 f242e9aec50346eaa7b3bddbda127d84 58ece9e5771a44c2918fd8f7783186f0 - - default default] Lock "a7577fff-cf4c-4c47-a754-b5e0b86ad3e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.557s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:18.128 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:14:19 compute-0 nova_compute[192903]: 2025-10-06 14:14:19.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:20 compute-0 nova_compute[192903]: 2025-10-06 14:14:20.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:24 compute-0 nova_compute[192903]: 2025-10-06 14:14:24.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:25 compute-0 nova_compute[192903]: 2025-10-06 14:14:25.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:29 compute-0 nova_compute[192903]: 2025-10-06 14:14:29.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:29 compute-0 podman[203308]: time="2025-10-06T14:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:14:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:14:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 06 14:14:30 compute-0 nova_compute[192903]: 2025-10-06 14:14:30.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:31 compute-0 podman[222038]: 2025-10-06 14:14:31.215366177 +0000 UTC m=+0.074552199 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:14:31 compute-0 podman[222039]: 2025-10-06 14:14:31.221704055 +0000 UTC m=+0.078519174 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:14:31 compute-0 podman[222040]: 2025-10-06 14:14:31.235588381 +0000 UTC m=+0.079602272 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:14:31 compute-0 podman[222037]: 2025-10-06 14:14:31.265419638 +0000 UTC m=+0.124735143 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: ERROR   14:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: ERROR   14:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: ERROR   14:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: ERROR   14:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: ERROR   14:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:14:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:14:34 compute-0 nova_compute[192903]: 2025-10-06 14:14:34.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:35 compute-0 nova_compute[192903]: 2025-10-06 14:14:35.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:36 compute-0 nova_compute[192903]: 2025-10-06 14:14:36.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:39 compute-0 nova_compute[192903]: 2025-10-06 14:14:39.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:40 compute-0 nova_compute[192903]: 2025-10-06 14:14:40.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:43 compute-0 podman[222122]: 2025-10-06 14:14:43.226734513 +0000 UTC m=+0.089251617 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:14:44 compute-0 nova_compute[192903]: 2025-10-06 14:14:44.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:44.847 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:ef:a1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-30a75e06-f298-492f-98eb-061c88d495a1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a75e06-f298-492f-98eb-061c88d495a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee59e8e222b4651ad77d73d5d791f85', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=296e815d-baab-4f43-a10f-44d4b1b0838d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=eb4fb73e-cdc3-487a-a06f-f3b909d0452c) old=Port_Binding(mac=['fa:16:3e:f6:ef:a1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-30a75e06-f298-492f-98eb-061c88d495a1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a75e06-f298-492f-98eb-061c88d495a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee59e8e222b4651ad77d73d5d791f85', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:14:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:44.848 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port eb4fb73e-cdc3-487a-a06f-f3b909d0452c in datapath 30a75e06-f298-492f-98eb-061c88d495a1 updated
Oct 06 14:14:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:44.849 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30a75e06-f298-492f-98eb-061c88d495a1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:14:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:44.850 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[099836ef-a74a-45d1-b363-75bbf310e875]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:45 compute-0 nova_compute[192903]: 2025-10-06 14:14:45.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:46 compute-0 podman[222142]: 2025-10-06 14:14:46.201931183 +0000 UTC m=+0.070451001 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Oct 06 14:14:49 compute-0 nova_compute[192903]: 2025-10-06 14:14:49.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:50 compute-0 nova_compute[192903]: 2025-10-06 14:14:50.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:51.325 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:49:55 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7413d7c3-dab0-4605-8da2-36f84f26399c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7413d7c3-dab0-4605-8da2-36f84f26399c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '699e7494fd9346cabfcfbb104d9142b9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cf6f2e2-4dba-4ac4-90f0-3c2fbfe4af4d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9bbb2613-816d-4950-80dd-a597cb6e6f23) old=Port_Binding(mac=['fa:16:3e:77:49:55'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7413d7c3-dab0-4605-8da2-36f84f26399c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7413d7c3-dab0-4605-8da2-36f84f26399c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '699e7494fd9346cabfcfbb104d9142b9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:14:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:51.326 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9bbb2613-816d-4950-80dd-a597cb6e6f23 in datapath 7413d7c3-dab0-4605-8da2-36f84f26399c updated
Oct 06 14:14:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:51.328 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7413d7c3-dab0-4605-8da2-36f84f26399c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:14:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:14:51.329 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f6186f-ed18-45e9-8e50-fed69919476c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:14:54 compute-0 nova_compute[192903]: 2025-10-06 14:14:54.086 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:54 compute-0 nova_compute[192903]: 2025-10-06 14:14:54.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:54 compute-0 nova_compute[192903]: 2025-10-06 14:14:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.104 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.105 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.105 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.105 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.276 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.277 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.306 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.306 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.30218124389648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.307 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.307 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:14:55 compute-0 nova_compute[192903]: 2025-10-06 14:14:55.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:56 compute-0 nova_compute[192903]: 2025-10-06 14:14:56.528 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:14:56 compute-0 nova_compute[192903]: 2025-10-06 14:14:56.529 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:14:55 up  1:15,  0 user,  load average: 0.17, 0.38, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:14:56 compute-0 nova_compute[192903]: 2025-10-06 14:14:56.550 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:14:57 compute-0 nova_compute[192903]: 2025-10-06 14:14:57.058 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:14:57 compute-0 nova_compute[192903]: 2025-10-06 14:14:57.572 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:14:57 compute-0 nova_compute[192903]: 2025-10-06 14:14:57.573 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.266s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:14:59 compute-0 nova_compute[192903]: 2025-10-06 14:14:59.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:14:59 compute-0 nova_compute[192903]: 2025-10-06 14:14:59.568 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:59 compute-0 nova_compute[192903]: 2025-10-06 14:14:59.569 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:14:59 compute-0 podman[203308]: time="2025-10-06T14:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:14:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:14:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Oct 06 14:15:00 compute-0 nova_compute[192903]: 2025-10-06 14:15:00.078 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:00 compute-0 nova_compute[192903]: 2025-10-06 14:15:00.078 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:00 compute-0 nova_compute[192903]: 2025-10-06 14:15:00.078 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:15:00 compute-0 nova_compute[192903]: 2025-10-06 14:15:00.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:00 compute-0 nova_compute[192903]: 2025-10-06 14:15:00.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: ERROR   14:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: ERROR   14:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: ERROR   14:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: ERROR   14:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: ERROR   14:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:15:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:15:02 compute-0 podman[222175]: 2025-10-06 14:15:02.237062783 +0000 UTC m=+0.078270957 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:15:02 compute-0 podman[222168]: 2025-10-06 14:15:02.263827239 +0000 UTC m=+0.112067079 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:15:02 compute-0 podman[222169]: 2025-10-06 14:15:02.270654869 +0000 UTC m=+0.114709188 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 14:15:02 compute-0 podman[222167]: 2025-10-06 14:15:02.27637187 +0000 UTC m=+0.137874900 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:15:02 compute-0 nova_compute[192903]: 2025-10-06 14:15:02.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:04 compute-0 nova_compute[192903]: 2025-10-06 14:15:04.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:04 compute-0 nova_compute[192903]: 2025-10-06 14:15:04.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:05 compute-0 nova_compute[192903]: 2025-10-06 14:15:05.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:09 compute-0 nova_compute[192903]: 2025-10-06 14:15:09.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:10 compute-0 ovn_controller[95205]: 2025-10-06T14:15:10Z|00152|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 06 14:15:10 compute-0 nova_compute[192903]: 2025-10-06 14:15:10.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:11.376 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:15:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:11.376 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:15:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:11.377 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:15:14 compute-0 nova_compute[192903]: 2025-10-06 14:15:14.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:14 compute-0 podman[222252]: 2025-10-06 14:15:14.232352192 +0000 UTC m=+0.100709129 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:15:15 compute-0 nova_compute[192903]: 2025-10-06 14:15:15.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:17.066 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:aa:b9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa26a29b35704c20a2516da6a6faa917', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0ee47753-a40c-4a21-a6ed-65093b6727d9) old=Port_Binding(mac=['fa:16:3e:91:aa:b9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa26a29b35704c20a2516da6a6faa917', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:15:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:17.067 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0ee47753-a40c-4a21-a6ed-65093b6727d9 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e updated
Oct 06 14:15:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:17.067 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:15:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:17.069 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7ffd4599-4e88-45b7-bae5-46d428f2af48]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:15:17 compute-0 podman[222274]: 2025-10-06 14:15:17.210081168 +0000 UTC m=+0.083007722 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Oct 06 14:15:19 compute-0 nova_compute[192903]: 2025-10-06 14:15:19.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:20 compute-0 nova_compute[192903]: 2025-10-06 14:15:20.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:24 compute-0 nova_compute[192903]: 2025-10-06 14:15:24.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:25 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:25.123 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:f1:fe 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0d7537d0-8bf9-4259-aecc-c1fdbae4742d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d7537d0-8bf9-4259-aecc-c1fdbae4742d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3f04da8-a772-4cc0-829d-8b56f0254ca9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=682aca11-dadc-4486-8044-9e6869c24786) old=Port_Binding(mac=['fa:16:3e:91:f1:fe'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0d7537d0-8bf9-4259-aecc-c1fdbae4742d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d7537d0-8bf9-4259-aecc-c1fdbae4742d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:15:25 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:25.124 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 682aca11-dadc-4486-8044-9e6869c24786 in datapath 0d7537d0-8bf9-4259-aecc-c1fdbae4742d updated
Oct 06 14:15:25 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:25.125 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0d7537d0-8bf9-4259-aecc-c1fdbae4742d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:15:25 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:25.126 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[296e513a-511c-4e2d-88fe-2936010244b4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:15:25 compute-0 nova_compute[192903]: 2025-10-06 14:15:25.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:28.101 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:15:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:28.102 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:15:28 compute-0 nova_compute[192903]: 2025-10-06 14:15:28.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:29 compute-0 nova_compute[192903]: 2025-10-06 14:15:29.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:29 compute-0 podman[203308]: time="2025-10-06T14:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:15:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:15:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 06 14:15:30 compute-0 nova_compute[192903]: 2025-10-06 14:15:30.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: ERROR   14:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: ERROR   14:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: ERROR   14:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: ERROR   14:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: ERROR   14:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:15:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:15:32 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:15:32.104 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:15:33 compute-0 podman[222300]: 2025-10-06 14:15:33.231092896 +0000 UTC m=+0.071706743 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:15:33 compute-0 podman[222298]: 2025-10-06 14:15:33.239304783 +0000 UTC m=+0.082695433 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:15:33 compute-0 podman[222299]: 2025-10-06 14:15:33.265224887 +0000 UTC m=+0.103950945 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:15:33 compute-0 podman[222297]: 2025-10-06 14:15:33.265911035 +0000 UTC m=+0.118056606 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:15:34 compute-0 nova_compute[192903]: 2025-10-06 14:15:34.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:35 compute-0 nova_compute[192903]: 2025-10-06 14:15:35.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:39 compute-0 nova_compute[192903]: 2025-10-06 14:15:39.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:40 compute-0 nova_compute[192903]: 2025-10-06 14:15:40.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:44 compute-0 nova_compute[192903]: 2025-10-06 14:15:44.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:45 compute-0 podman[222381]: 2025-10-06 14:15:45.197898626 +0000 UTC m=+0.063473217 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:15:45 compute-0 nova_compute[192903]: 2025-10-06 14:15:45.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:48 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:15:48 compute-0 podman[222400]: 2025-10-06 14:15:48.17900703 +0000 UTC m=+0.067613636 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 06 14:15:49 compute-0 nova_compute[192903]: 2025-10-06 14:15:49.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:50 compute-0 nova_compute[192903]: 2025-10-06 14:15:50.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:54 compute-0 nova_compute[192903]: 2025-10-06 14:15:54.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:54 compute-0 nova_compute[192903]: 2025-10-06 14:15:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:54 compute-0 nova_compute[192903]: 2025-10-06 14:15:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.118 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.119 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.119 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.119 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.264 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.265 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.282 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.283 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5875MB free_disk=73.30218124389648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.283 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.283 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:15:55 compute-0 nova_compute[192903]: 2025-10-06 14:15:55.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:56 compute-0 nova_compute[192903]: 2025-10-06 14:15:56.358 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:15:56 compute-0 nova_compute[192903]: 2025-10-06 14:15:56.358 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:15:55 up  1:16,  0 user,  load average: 0.06, 0.31, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:15:56 compute-0 nova_compute[192903]: 2025-10-06 14:15:56.428 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:15:56 compute-0 nova_compute[192903]: 2025-10-06 14:15:56.958 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:15:57 compute-0 nova_compute[192903]: 2025-10-06 14:15:57.495 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:15:57 compute-0 nova_compute[192903]: 2025-10-06 14:15:57.495 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.212s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:15:58 compute-0 nova_compute[192903]: 2025-10-06 14:15:58.453 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:15:58 compute-0 nova_compute[192903]: 2025-10-06 14:15:58.454 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:15:58 compute-0 nova_compute[192903]: 2025-10-06 14:15:58.958 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:15:59 compute-0 nova_compute[192903]: 2025-10-06 14:15:59.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:15:59 compute-0 nova_compute[192903]: 2025-10-06 14:15:59.561 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:15:59 compute-0 nova_compute[192903]: 2025-10-06 14:15:59.562 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:15:59 compute-0 nova_compute[192903]: 2025-10-06 14:15:59.566 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:15:59 compute-0 nova_compute[192903]: 2025-10-06 14:15:59.566 2 INFO nova.compute.claims [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:15:59 compute-0 podman[203308]: time="2025-10-06T14:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:15:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:15:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 06 14:16:00 compute-0 nova_compute[192903]: 2025-10-06 14:16:00.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:00 compute-0 nova_compute[192903]: 2025-10-06 14:16:00.671 2 DEBUG nova.compute.provider_tree [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:16:01 compute-0 nova_compute[192903]: 2025-10-06 14:16:01.179 2 DEBUG nova.scheduler.client.report [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: ERROR   14:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: ERROR   14:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: ERROR   14:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: ERROR   14:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: ERROR   14:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:16:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:16:01 compute-0 nova_compute[192903]: 2025-10-06 14:16:01.689 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:01 compute-0 nova_compute[192903]: 2025-10-06 14:16:01.689 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.201 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.201 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.201 2 WARNING neutronclient.v2_0.client [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.202 2 WARNING neutronclient.v2_0.client [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.492 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.492 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.493 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.493 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.494 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:16:02 compute-0 nova_compute[192903]: 2025-10-06 14:16:02.709 2 INFO nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:16:03 compute-0 nova_compute[192903]: 2025-10-06 14:16:03.219 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:16:04 compute-0 podman[222426]: 2025-10-06 14:16:04.187874258 +0000 UTC m=+0.050409031 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Oct 06 14:16:04 compute-0 podman[222425]: 2025-10-06 14:16:04.189569043 +0000 UTC m=+0.055305651 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 06 14:16:04 compute-0 podman[222424]: 2025-10-06 14:16:04.213811273 +0000 UTC m=+0.083664609 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 14:16:04 compute-0 podman[222427]: 2025-10-06 14:16:04.225899762 +0000 UTC m=+0.080689871 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.244 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.245 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.245 2 INFO nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Creating image(s)
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.246 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.246 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.247 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.247 2 DEBUG oslo_utils.imageutils.format_inspector [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.250 2 DEBUG oslo_utils.imageutils.format_inspector [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.252 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.296 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Successfully created port: 18d48c5d-b383-4b4b-9188-a8aac7e21179 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.300 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.300 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.300 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.301 2 DEBUG oslo_utils.imageutils.format_inspector [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.303 2 DEBUG oslo_utils.imageutils.format_inspector [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.304 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.351 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.352 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.398 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.399 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.400 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.455 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.456 2 DEBUG nova.virt.disk.api [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Checking if we can resize image /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.456 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.523 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.524 2 DEBUG nova.virt.disk.api [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Cannot resize image /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.524 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.525 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Ensure instance console log exists: /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.525 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.525 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.526 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:04 compute-0 nova_compute[192903]: 2025-10-06 14:16:04.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:05 compute-0 nova_compute[192903]: 2025-10-06 14:16:05.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:05 compute-0 nova_compute[192903]: 2025-10-06 14:16:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.242 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Successfully updated port: 18d48c5d-b383-4b4b-9188-a8aac7e21179 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.309 2 DEBUG nova.compute.manager [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-changed-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.310 2 DEBUG nova.compute.manager [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Refreshing instance network info cache due to event network-changed-18d48c5d-b383-4b4b-9188-a8aac7e21179. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.310 2 DEBUG oslo_concurrency.lockutils [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.310 2 DEBUG oslo_concurrency.lockutils [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.310 2 DEBUG nova.network.neutron [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Refreshing network info cache for port 18d48c5d-b383-4b4b-9188-a8aac7e21179 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.749 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:16:06 compute-0 nova_compute[192903]: 2025-10-06 14:16:06.817 2 WARNING neutronclient.v2_0.client [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:07 compute-0 nova_compute[192903]: 2025-10-06 14:16:07.055 2 DEBUG nova.network.neutron [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:16:07 compute-0 nova_compute[192903]: 2025-10-06 14:16:07.265 2 DEBUG nova.network.neutron [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:16:07 compute-0 nova_compute[192903]: 2025-10-06 14:16:07.773 2 DEBUG oslo_concurrency.lockutils [req-1853c093-6211-476b-8c2b-169b145d9d52 req-b4cdcbdf-f2c7-486b-8e1f-4cb9841d150d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:16:07 compute-0 nova_compute[192903]: 2025-10-06 14:16:07.774 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquired lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:16:07 compute-0 nova_compute[192903]: 2025-10-06 14:16:07.774 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:16:09 compute-0 nova_compute[192903]: 2025-10-06 14:16:09.094 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:16:09 compute-0 nova_compute[192903]: 2025-10-06 14:16:09.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.117 2 WARNING neutronclient.v2_0.client [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.279 2 DEBUG nova.network.neutron [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updating instance_info_cache with network_info: [{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.788 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Releasing lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.789 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance network_info: |[{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.791 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Start _get_guest_xml network_info=[{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.797 2 WARNING nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.799 2 DEBUG nova.virt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1076227946', uuid='94963e04-a73a-4d6e-8f87-59453794973a'), owner=OwnerMeta(userid='98ee6da236ba42baa0fef11dcb52cbdd', username='tempest-TestExecuteStrategies-1255317741-project-admin', projectid='8f3f3b7d20fc4715811486da569fc0ab', projectname='tempest-TestExecuteStrategies-1255317741'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759760170.7990685) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.804 2 DEBUG nova.virt.libvirt.host [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.804 2 DEBUG nova.virt.libvirt.host [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.812 2 DEBUG nova.virt.libvirt.host [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.813 2 DEBUG nova.virt.libvirt.host [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.814 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.814 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.814 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.815 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.815 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.815 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.815 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.815 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.816 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.816 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.816 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.816 2 DEBUG nova.virt.hardware [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.820 2 DEBUG nova.virt.libvirt.vif [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:15:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1076227946',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1076227946',id=17,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-zdlmcqkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:16:03Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=94963e04-a73a-4d6e-8f87-59453794973a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.820 2 DEBUG nova.network.os_vif_util [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.821 2 DEBUG nova.network.os_vif_util [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:16:10 compute-0 nova_compute[192903]: 2025-10-06 14:16:10.822 2 DEBUG nova.objects.instance [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'pci_devices' on Instance uuid 94963e04-a73a-4d6e-8f87-59453794973a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.334 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <uuid>94963e04-a73a-4d6e-8f87-59453794973a</uuid>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <name>instance-00000011</name>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-1076227946</nova:name>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:16:10</nova:creationTime>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:16:11 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:16:11 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         <nova:port uuid="18d48c5d-b383-4b4b-9188-a8aac7e21179">
Oct 06 14:16:11 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <system>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="serial">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="uuid">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </system>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <os>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </os>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <features>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </features>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:45:36:88"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <target dev="tap18d48c5d-b3"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <video>
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </video>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:16:11 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:16:11 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:16:11 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:16:11 compute-0 nova_compute[192903]: </domain>
Oct 06 14:16:11 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.336 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Preparing to wait for external event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.336 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.336 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.337 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.337 2 DEBUG nova.virt.libvirt.vif [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:15:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1076227946',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1076227946',id=17,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-zdlmcqkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:16:03Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=94963e04-a73a-4d6e-8f87-59453794973a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.338 2 DEBUG nova.network.os_vif_util [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.338 2 DEBUG nova.network.os_vif_util [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.339 2 DEBUG os_vif [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '41ce273b-c975-59dc-aeaf-8eebc39342c7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:11.378 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:11.378 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:11.379 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.402 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18d48c5d-b3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.403 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap18d48c5d-b3, col_values=(('qos', UUID('3e3a8448-c1ca-405a-8cad-797181c0199d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.403 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap18d48c5d-b3, col_values=(('external_ids', {'iface-id': '18d48c5d-b383-4b4b-9188-a8aac7e21179', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:36:88', 'vm-uuid': '94963e04-a73a-4d6e-8f87-59453794973a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 NetworkManager[52035]: <info>  [1759760171.4056] manager: (tap18d48c5d-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:11 compute-0 nova_compute[192903]: 2025-10-06 14:16:11.416 2 INFO os_vif [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3')
Oct 06 14:16:12 compute-0 nova_compute[192903]: 2025-10-06 14:16:12.965 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:16:12 compute-0 nova_compute[192903]: 2025-10-06 14:16:12.965 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:16:12 compute-0 nova_compute[192903]: 2025-10-06 14:16:12.966 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No VIF found with MAC fa:16:3e:45:36:88, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:16:12 compute-0 nova_compute[192903]: 2025-10-06 14:16:12.966 2 INFO nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Using config drive
Oct 06 14:16:13 compute-0 nova_compute[192903]: 2025-10-06 14:16:13.477 2 WARNING neutronclient.v2_0.client [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.172 2 INFO nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Creating config drive at /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.181 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp3rwetfuf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.322 2 DEBUG oslo_concurrency.processutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp3rwetfuf" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:14 compute-0 kernel: tap18d48c5d-b3: entered promiscuous mode
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.4018] manager: (tap18d48c5d-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Oct 06 14:16:14 compute-0 ovn_controller[95205]: 2025-10-06T14:16:14Z|00153|binding|INFO|Claiming lport 18d48c5d-b383-4b4b-9188-a8aac7e21179 for this chassis.
Oct 06 14:16:14 compute-0 ovn_controller[95205]: 2025-10-06T14:16:14Z|00154|binding|INFO|18d48c5d-b383-4b4b-9188-a8aac7e21179: Claiming fa:16:3e:45:36:88 10.100.0.9
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.428 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:36:88 10.100.0.9'], port_security=['fa:16:3e:45:36:88 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '94963e04-a73a-4d6e-8f87-59453794973a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=18d48c5d-b383-4b4b-9188-a8aac7e21179) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:16:14 compute-0 systemd-udevd[222541]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.430 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 18d48c5d-b383-4b4b-9188-a8aac7e21179 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e bound to our chassis
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.431 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.4444] device (tap18d48c5d-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.4456] device (tap18d48c5d-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.450 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[71295adf-fa33-4092-8329-91764c57d367]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.451 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.454 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.454 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[76698607-fe6f-4393-9da0-5ef20b4017d6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.456 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[36581b18-adfe-4ced-9b6f-39e77115b162]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 systemd-machined[152985]: New machine qemu-13-instance-00000011.
Oct 06 14:16:14 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.471 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[06308aa1-05da-4df7-8c75-1e38ed931f6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.481 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a9eed9-06d4-4037-b3e7-8b6c125b1d31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_controller[95205]: 2025-10-06T14:16:14Z|00155|binding|INFO|Setting lport 18d48c5d-b383-4b4b-9188-a8aac7e21179 ovn-installed in OVS
Oct 06 14:16:14 compute-0 ovn_controller[95205]: 2025-10-06T14:16:14Z|00156|binding|INFO|Setting lport 18d48c5d-b383-4b4b-9188-a8aac7e21179 up in Southbound
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.509 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e6f34e-bf5b-4904-a589-537ed29c6260]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.519 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b08bae99-3ba2-4015-90a5-ad17f20ee1ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.5215] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.553 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[60e828ac-1372-4261-bc65-4f4fbf574198]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.556 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[9837fa03-bf33-42fa-8c42-b88927fad3da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.5749] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.579 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[904b24ff-dd50-4aba-aec9-f5e4320a532e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.595 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c7bd5cb8-ef4d-4d26-9260-5f74b10706bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463621, 'reachable_time': 41326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222575, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.612 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb208f4-cb55-44df-b36b-8b79058f38fd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463621, 'tstamp': 463621}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222576, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.633 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6d35e288-b99f-4c70-a5f3-003f2daf564a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463621, 'reachable_time': 41326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222577, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.663 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9b70f1b4-b4f6-4093-bd28-e8c8d6c7d7ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.705 2 DEBUG nova.compute.manager [req-e0191518-162b-4551-9bed-8107420a0018 req-601ad4f6-3358-4dc6-9fc4-82e2066afb2e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.706 2 DEBUG oslo_concurrency.lockutils [req-e0191518-162b-4551-9bed-8107420a0018 req-601ad4f6-3358-4dc6-9fc4-82e2066afb2e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.706 2 DEBUG oslo_concurrency.lockutils [req-e0191518-162b-4551-9bed-8107420a0018 req-601ad4f6-3358-4dc6-9fc4-82e2066afb2e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.706 2 DEBUG oslo_concurrency.lockutils [req-e0191518-162b-4551-9bed-8107420a0018 req-601ad4f6-3358-4dc6-9fc4-82e2066afb2e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.706 2 DEBUG nova.compute.manager [req-e0191518-162b-4551-9bed-8107420a0018 req-601ad4f6-3358-4dc6-9fc4-82e2066afb2e e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Processing event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.738 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4f201830-b585-4b5f-a00b-c2c8348de8b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.740 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.740 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.741 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 NetworkManager[52035]: <info>  [1759760174.7476] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct 06 14:16:14 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.750 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 ovn_controller[95205]: 2025-10-06T14:16:14Z|00157|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:16:14 compute-0 nova_compute[192903]: 2025-10-06 14:16:14.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.764 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[93d1974f-2aaf-4af6-92dc-56c558ffb120]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.765 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.765 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.765 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.765 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.766 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[db95cbb2-0e44-4113-8e83-ddf5f634192f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.767 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.768 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a513c925-56c3-494d-ab11-34f99b0bcf02]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.768 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:16:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:14.770 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:16:15 compute-0 podman[222616]: 2025-10-06 14:16:15.19470218 +0000 UTC m=+0.052963149 container create 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:16:15 compute-0 systemd[1]: Started libpod-conmon-00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56.scope.
Oct 06 14:16:15 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:16:15 compute-0 podman[222616]: 2025-10-06 14:16:15.163831435 +0000 UTC m=+0.022092424 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73b84f5670d1889864f76346dfcac32244d0fde16c6d45dfc7ecd904fd08186c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:16:15 compute-0 podman[222616]: 2025-10-06 14:16:15.282108737 +0000 UTC m=+0.140369756 container init 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest)
Oct 06 14:16:15 compute-0 podman[222616]: 2025-10-06 14:16:15.29434745 +0000 UTC m=+0.152608429 container start 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:16:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [NOTICE]   (222648) : New worker (222656) forked
Oct 06 14:16:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [NOTICE]   (222648) : Loading success.
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.354 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:16:15 compute-0 podman[222631]: 2025-10-06 14:16:15.357143537 +0000 UTC m=+0.099260261 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.362 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.367 2 INFO nova.virt.libvirt.driver [-] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance spawned successfully.
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.368 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.886 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.887 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.887 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.888 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.888 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:15 compute-0 nova_compute[192903]: 2025-10-06 14:16:15.889 2 DEBUG nova.virt.libvirt.driver [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.405 2 INFO nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Took 12.16 seconds to spawn the instance on the hypervisor.
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.406 2 DEBUG nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.762 2 DEBUG nova.compute.manager [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.763 2 DEBUG oslo_concurrency.lockutils [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.763 2 DEBUG oslo_concurrency.lockutils [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.763 2 DEBUG oslo_concurrency.lockutils [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.764 2 DEBUG nova.compute.manager [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No waiting events found dispatching network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:16:16 compute-0 nova_compute[192903]: 2025-10-06 14:16:16.764 2 WARNING nova.compute.manager [req-b9f089cf-3c2f-4863-90fe-269cadea5e92 req-10142f4b-221a-47dd-9bca-d8ed14726970 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received unexpected event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with vm_state active and task_state None.
Oct 06 14:16:17 compute-0 nova_compute[192903]: 2025-10-06 14:16:17.061 2 INFO nova.compute.manager [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Took 17.60 seconds to build instance.
Oct 06 14:16:17 compute-0 nova_compute[192903]: 2025-10-06 14:16:17.571 2 DEBUG oslo_concurrency.lockutils [None req-3b009ccf-59d6-4bab-b6d5-9956ac8415cf 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:19 compute-0 podman[222665]: 2025-10-06 14:16:19.207414714 +0000 UTC m=+0.069959588 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6)
Oct 06 14:16:19 compute-0 nova_compute[192903]: 2025-10-06 14:16:19.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:21 compute-0 nova_compute[192903]: 2025-10-06 14:16:21.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:24 compute-0 nova_compute[192903]: 2025-10-06 14:16:24.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:26 compute-0 nova_compute[192903]: 2025-10-06 14:16:26.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:28 compute-0 ovn_controller[95205]: 2025-10-06T14:16:28Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:36:88 10.100.0.9
Oct 06 14:16:28 compute-0 ovn_controller[95205]: 2025-10-06T14:16:28Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:36:88 10.100.0.9
Oct 06 14:16:29 compute-0 nova_compute[192903]: 2025-10-06 14:16:29.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:29 compute-0 podman[203308]: time="2025-10-06T14:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:16:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:16:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 06 14:16:31 compute-0 openstack_network_exporter[205500]: ERROR   14:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:16:31 compute-0 openstack_network_exporter[205500]: ERROR   14:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:16:31 compute-0 openstack_network_exporter[205500]: ERROR   14:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:16:31 compute-0 openstack_network_exporter[205500]: ERROR   14:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:16:31 compute-0 openstack_network_exporter[205500]: ERROR   14:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:16:31 compute-0 nova_compute[192903]: 2025-10-06 14:16:31.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:34 compute-0 nova_compute[192903]: 2025-10-06 14:16:34.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:35 compute-0 podman[222703]: 2025-10-06 14:16:35.251349036 +0000 UTC m=+0.083884635 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:16:35 compute-0 podman[222701]: 2025-10-06 14:16:35.266448195 +0000 UTC m=+0.115864180 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 06 14:16:35 compute-0 podman[222700]: 2025-10-06 14:16:35.281490412 +0000 UTC m=+0.129910490 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:16:35 compute-0 podman[222702]: 2025-10-06 14:16:35.285811596 +0000 UTC m=+0.121625322 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 06 14:16:36 compute-0 nova_compute[192903]: 2025-10-06 14:16:36.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:37 compute-0 nova_compute[192903]: 2025-10-06 14:16:37.150 2 DEBUG nova.compute.manager [None req-e8a4f174-294c-4452-a123-e6c30e85a2af f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 06 14:16:37 compute-0 nova_compute[192903]: 2025-10-06 14:16:37.214 2 DEBUG nova.compute.provider_tree [None req-e8a4f174-294c-4452-a123-e6c30e85a2af f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 13 to 17 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 14:16:39 compute-0 nova_compute[192903]: 2025-10-06 14:16:39.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:41 compute-0 nova_compute[192903]: 2025-10-06 14:16:41.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:44 compute-0 nova_compute[192903]: 2025-10-06 14:16:44.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:44 compute-0 ovn_controller[95205]: 2025-10-06T14:16:44Z|00158|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Oct 06 14:16:44 compute-0 nova_compute[192903]: 2025-10-06 14:16:44.801 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Check if temp file /var/lib/nova/instances/tmpw1vltcjj exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 06 14:16:44 compute-0 nova_compute[192903]: 2025-10-06 14:16:44.808 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw1vltcjj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='94963e04-a73a-4d6e-8f87-59453794973a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 06 14:16:46 compute-0 podman[222789]: 2025-10-06 14:16:46.246278135 +0000 UTC m=+0.106995625 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Oct 06 14:16:46 compute-0 nova_compute[192903]: 2025-10-06 14:16:46.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.672 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.737 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.739 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.801 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.803 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Preparing to wait for external event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.803 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.803 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:48 compute-0 nova_compute[192903]: 2025-10-06 14:16:48.804 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:49 compute-0 nova_compute[192903]: 2025-10-06 14:16:49.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:50 compute-0 podman[222815]: 2025-10-06 14:16:50.224920279 +0000 UTC m=+0.082733545 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 06 14:16:51 compute-0 nova_compute[192903]: 2025-10-06 14:16:51.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:54.499 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:16:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:16:54.501 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.526 2 DEBUG nova.compute.manager [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.527 2 DEBUG oslo_concurrency.lockutils [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.527 2 DEBUG oslo_concurrency.lockutils [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.528 2 DEBUG oslo_concurrency.lockutils [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.528 2 DEBUG nova.compute.manager [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No event matching network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 in dict_keys([('network-vif-plugged', '18d48c5d-b383-4b4b-9188-a8aac7e21179')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.529 2 DEBUG nova.compute.manager [req-4d132627-c9d7-4111-9e83-c6a96fd2b02e req-47cce14c-632d-48e3-9330-cf14bfcbb946 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:16:54 compute-0 nova_compute[192903]: 2025-10-06 14:16:54.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:55 compute-0 nova_compute[192903]: 2025-10-06 14:16:55.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.347 2 INFO nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Took 7.54 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.585 2 DEBUG nova.compute.manager [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.585 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.585 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.586 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.586 2 DEBUG nova.compute.manager [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Processing event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.586 2 DEBUG nova.compute.manager [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-changed-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.586 2 DEBUG nova.compute.manager [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Refreshing instance network info cache due to event network-changed-18d48c5d-b383-4b4b-9188-a8aac7e21179. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.586 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.587 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.587 2 DEBUG nova.network.neutron [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Refreshing network info cache for port 18d48c5d-b383-4b4b-9188-a8aac7e21179 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:16:56 compute-0 nova_compute[192903]: 2025-10-06 14:16:56.589 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.094 2 WARNING neutronclient.v2_0.client [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.099 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw1vltcjj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='94963e04-a73a-4d6e-8f87-59453794973a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7f5fc136-7851-4fb5-8a7d-b39adadee57a),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.142 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.207 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.208 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.273 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.413 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.415 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.437 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.438 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=73.27326583862305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.438 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.438 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.614 2 DEBUG nova.objects.instance [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 94963e04-a73a-4d6e-8f87-59453794973a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.615 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.620 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.620 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.689 2 WARNING neutronclient.v2_0.client [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.841 2 DEBUG nova.network.neutron [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updated VIF entry in instance network info cache for port 18d48c5d-b383-4b4b-9188-a8aac7e21179. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 06 14:16:57 compute-0 nova_compute[192903]: 2025-10-06 14:16:57.841 2 DEBUG nova.network.neutron [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updating instance_info_cache with network_info: [{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.123 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.124 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.128 2 DEBUG nova.virt.libvirt.vif [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:15:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1076227946',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1076227946',id=17,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:16:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-zdlmcqkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:16:16Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=94963e04-a73a-4d6e-8f87-59453794973a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.128 2 DEBUG nova.network.os_vif_util [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.128 2 DEBUG nova.network.os_vif_util [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.129 2 DEBUG nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updating guest XML with vif config: <interface type="ethernet">
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <mac address="fa:16:3e:45:36:88"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <model type="virtio"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <mtu size="1442"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <target dev="tap18d48c5d-b3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]: </interface>
Oct 06 14:16:58 compute-0 nova_compute[192903]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.129 2 DEBUG nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <name>instance-00000011</name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <uuid>94963e04-a73a-4d6e-8f87-59453794973a</uuid>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-1076227946</nova:name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:16:10</nova:creationTime>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:port uuid="18d48c5d-b383-4b4b-9188-a8aac7e21179">
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="serial">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="uuid">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <interface type="ethernet"><mac address="fa:16:3e:45:36:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap18d48c5d-b3"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </interface><serial type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </target>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </console>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </input>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]: </domain>
Oct 06 14:16:58 compute-0 nova_compute[192903]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.131 2 DEBUG nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <name>instance-00000011</name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <uuid>94963e04-a73a-4d6e-8f87-59453794973a</uuid>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-1076227946</nova:name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:16:10</nova:creationTime>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:port uuid="18d48c5d-b383-4b4b-9188-a8aac7e21179">
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="serial">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="uuid">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:45:36:88"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="tap18d48c5d-b3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </target>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </console>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </input>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]: </domain>
Oct 06 14:16:58 compute-0 nova_compute[192903]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.131 2 DEBUG nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <name>instance-00000011</name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <uuid>94963e04-a73a-4d6e-8f87-59453794973a</uuid>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-1076227946</nova:name>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:16:10</nova:creationTime>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <nova:port uuid="18d48c5d-b383-4b4b-9188-a8aac7e21179">
Oct 06 14:16:58 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="serial">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="uuid">94963e04-a73a-4d6e-8f87-59453794973a</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </system>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </os>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </features>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/disk.config"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <interface type="ethernet"><mac address="fa:16:3e:45:36:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap18d48c5d-b3"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </interface><serial type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:16:58 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       </target>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a/console.log" append="off"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </console>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </input>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </video>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:16:58 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:16:58 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:16:58 compute-0 nova_compute[192903]: </domain>
Oct 06 14:16:58 compute-0 nova_compute[192903]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.132 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.348 2 DEBUG oslo_concurrency.lockutils [req-0828b069-1fe5-46db-a7db-504b2e72d31c req-741a003a-0ab0-4453-91c9-44b8c139e7d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-94963e04-a73a-4d6e-8f87-59453794973a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.456 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Updating resource usage from migration 7f5fc136-7851-4fb5-8a7d-b39adadee57a
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.485 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration 7f5fc136-7851-4fb5-8a7d-b39adadee57a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.485 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.485 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:16:57 up  1:17,  0 user,  load average: 0.23, 0.32, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.523 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.626 2 DEBUG nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 06 14:16:58 compute-0 nova_compute[192903]: 2025-10-06 14:16:58.626 2 INFO nova.virt.libvirt.migration [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 06 14:16:59 compute-0 nova_compute[192903]: 2025-10-06 14:16:59.030 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:16:59 compute-0 nova_compute[192903]: 2025-10-06 14:16:59.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:16:59 compute-0 nova_compute[192903]: 2025-10-06 14:16:59.540 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:16:59 compute-0 nova_compute[192903]: 2025-10-06 14:16:59.540 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:16:59 compute-0 nova_compute[192903]: 2025-10-06 14:16:59.644 2 INFO nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 06 14:16:59 compute-0 podman[203308]: time="2025-10-06T14:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:16:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:16:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 06 14:17:00 compute-0 kernel: tap18d48c5d-b3 (unregistering): left promiscuous mode
Oct 06 14:17:00 compute-0 NetworkManager[52035]: <info>  [1759760220.1442] device (tap18d48c5d-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:00 compute-0 ovn_controller[95205]: 2025-10-06T14:17:00Z|00159|binding|INFO|Releasing lport 18d48c5d-b383-4b4b-9188-a8aac7e21179 from this chassis (sb_readonly=0)
Oct 06 14:17:00 compute-0 ovn_controller[95205]: 2025-10-06T14:17:00Z|00160|binding|INFO|Setting lport 18d48c5d-b383-4b4b-9188-a8aac7e21179 down in Southbound
Oct 06 14:17:00 compute-0 ovn_controller[95205]: 2025-10-06T14:17:00Z|00161|binding|INFO|Removing iface tap18d48c5d-b3 ovn-installed in OVS
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.161 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:36:88 10.100.0.9'], port_security=['fa:16:3e:45:36:88 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7f5c9d61-0a9d-467d-89a7-11e43e674cfc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '94963e04-a73a-4d6e-8f87-59453794973a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=18d48c5d-b383-4b4b-9188-a8aac7e21179) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.163 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 18d48c5d-b383-4b4b-9188-a8aac7e21179 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.164 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.166 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8f67a6bf-1df8-4336-9869-e222a1c5bdb9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.167 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct 06 14:17:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.865s CPU time.
Oct 06 14:17:00 compute-0 systemd-machined[152985]: Machine qemu-13-instance-00000011 terminated.
Oct 06 14:17:00 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [NOTICE]   (222648) : haproxy version is 3.0.5-8e879a5
Oct 06 14:17:00 compute-0 podman[222881]: 2025-10-06 14:17:00.321479925 +0000 UTC m=+0.038562989 container kill 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:17:00 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [NOTICE]   (222648) : path to executable is /usr/sbin/haproxy
Oct 06 14:17:00 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [WARNING]  (222648) : Exiting Master process...
Oct 06 14:17:00 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [ALERT]    (222648) : Current worker (222656) exited with code 143 (Terminated)
Oct 06 14:17:00 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[222632]: [WARNING]  (222648) : All workers exited. Exiting... (0)
Oct 06 14:17:00 compute-0 systemd[1]: libpod-00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56.scope: Deactivated successfully.
Oct 06 14:17:00 compute-0 conmon[222632]: conmon 00cefc47be7a790c50d5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56.scope/container/memory.events
Oct 06 14:17:00 compute-0 podman[222896]: 2025-10-06 14:17:00.366918395 +0000 UTC m=+0.024255421 container died 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.398 2 DEBUG nova.virt.libvirt.guest [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.399 2 INFO nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration operation has completed
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.399 2 INFO nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] _post_live_migration() is started..
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.402 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.403 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.403 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.411 2 WARNING neutronclient.v2_0.client [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.412 2 WARNING neutronclient.v2_0.client [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:17:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56-userdata-shm.mount: Deactivated successfully.
Oct 06 14:17:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-73b84f5670d1889864f76346dfcac32244d0fde16c6d45dfc7ecd904fd08186c-merged.mount: Deactivated successfully.
Oct 06 14:17:00 compute-0 podman[222896]: 2025-10-06 14:17:00.431697735 +0000 UTC m=+0.089034751 container cleanup 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 14:17:00 compute-0 systemd[1]: libpod-conmon-00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56.scope: Deactivated successfully.
Oct 06 14:17:00 compute-0 podman[222899]: 2025-10-06 14:17:00.453029888 +0000 UTC m=+0.096841257 container remove 00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.459 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ca55fc-a6e4-4d19-af26-707d7d34d91f]: (4, ("Mon Oct  6 02:17:00 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56)\n00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56\nMon Oct  6 02:17:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56)\n00cefc47be7a790c50d5cd785eb2f1aefab8548fd338ece7493827e761585f56\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.462 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a4675964-8ac6-4d60-9a62-e2e1c1589713]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.462 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.463 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[838fa55f-ffd1-4c36-a7f7-5dbb8058f619]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.464 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:00 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:17:00 compute-0 nova_compute[192903]: 2025-10-06 14:17:00.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.494 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[37524cd3-3269-490d-ab73-8b16da5c2633]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.521 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3c507d49-3ad4-4472-9675-2fc5803d97e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.523 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[10c069de-21f1-4f19-991a-aa97f9e2286d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.541 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5869c1c4-28f4-42c6-bab2-4da267740add]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463614, 'reachable_time': 36041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222948, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.543 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:17:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:00.543 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[81b5d18b-3da2-407d-87d5-7d8e12a5719a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:17:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.203 2 DEBUG nova.compute.manager [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.204 2 DEBUG oslo_concurrency.lockutils [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.205 2 DEBUG oslo_concurrency.lockutils [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.206 2 DEBUG oslo_concurrency.lockutils [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.206 2 DEBUG nova.compute.manager [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No waiting events found dispatching network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.207 2 DEBUG nova.compute.manager [req-4198fbd5-2b40-4902-b73d-5dbc7850a074 req-a62e65a9-5861-4487-a43a-96a3cd9dbc44 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: ERROR   14:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: ERROR   14:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: ERROR   14:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: ERROR   14:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: ERROR   14:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:17:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:17:01 compute-0 nova_compute[192903]: 2025-10-06 14:17:01.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.187 2 DEBUG nova.network.neutron [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Activated binding for port 18d48c5d-b383-4b4b-9188-a8aac7e21179 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.188 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.189 2 DEBUG nova.virt.libvirt.vif [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:15:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1076227946',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1076227946',id=17,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:16:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-zdlmcqkz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:16:39Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=94963e04-a73a-4d6e-8f87-59453794973a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.190 2 DEBUG nova.network.os_vif_util [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "address": "fa:16:3e:45:36:88", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18d48c5d-b3", "ovs_interfaceid": "18d48c5d-b383-4b4b-9188-a8aac7e21179", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.191 2 DEBUG nova.network.os_vif_util [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.192 2 DEBUG os_vif [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.195 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18d48c5d-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3e3a8448-c1ca-405a-8cad-797181c0199d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.207 2 INFO os_vif [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:36:88,bridge_name='br-int',has_traffic_filtering=True,id=18d48c5d-b383-4b4b-9188-a8aac7e21179,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18d48c5d-b3')
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.208 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.208 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.208 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.209 2 DEBUG nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.210 2 INFO nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Deleting instance files /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a_del
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.211 2 INFO nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Deletion of /var/lib/nova/instances/94963e04-a73a-4d6e-8f87-59453794973a_del complete
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.280 2 DEBUG nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.280 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.281 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.281 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.282 2 DEBUG nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No waiting events found dispatching network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.282 2 WARNING nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received unexpected event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with vm_state active and task_state migrating.
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.282 2 DEBUG nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.283 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.283 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.284 2 DEBUG oslo_concurrency.lockutils [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.284 2 DEBUG nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No waiting events found dispatching network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:17:03 compute-0 nova_compute[192903]: 2025-10-06 14:17:03.284 2 DEBUG nova.compute.manager [req-208eaaab-9b8a-49c9-b994-beebf32c3415 req-9526be39-5728-427f-bd68-cd6e1ebedcd2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-unplugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:17:04 compute-0 nova_compute[192903]: 2025-10-06 14:17:04.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:04.504 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:17:04 compute-0 nova_compute[192903]: 2025-10-06 14:17:04.537 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:04 compute-0 nova_compute[192903]: 2025-10-06 14:17:04.537 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.050 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.051 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.051 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.052 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.052 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.357 2 DEBUG nova.compute.manager [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.357 2 DEBUG oslo_concurrency.lockutils [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.358 2 DEBUG oslo_concurrency.lockutils [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.358 2 DEBUG oslo_concurrency.lockutils [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.358 2 DEBUG nova.compute.manager [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] No waiting events found dispatching network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:17:05 compute-0 nova_compute[192903]: 2025-10-06 14:17:05.359 2 WARNING nova.compute.manager [req-4ee15b9a-1876-48e2-97e0-4d370c19faf9 req-8e9cd1a5-d269-4d0e-89e2-18dfc7875301 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Received unexpected event network-vif-plugged-18d48c5d-b383-4b4b-9188-a8aac7e21179 for instance with vm_state active and task_state migrating.
Oct 06 14:17:06 compute-0 podman[222951]: 2025-10-06 14:17:06.211215773 +0000 UTC m=+0.067310538 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 14:17:06 compute-0 podman[222950]: 2025-10-06 14:17:06.217652423 +0000 UTC m=+0.077227710 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:17:06 compute-0 podman[222952]: 2025-10-06 14:17:06.249121603 +0000 UTC m=+0.089675737 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:17:06 compute-0 podman[222949]: 2025-10-06 14:17:06.289368815 +0000 UTC m=+0.144702410 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 06 14:17:07 compute-0 nova_compute[192903]: 2025-10-06 14:17:07.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:08 compute-0 nova_compute[192903]: 2025-10-06 14:17:08.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:09 compute-0 nova_compute[192903]: 2025-10-06 14:17:09.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:11.380 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:11.380 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:11.380 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:13 compute-0 nova_compute[192903]: 2025-10-06 14:17:13.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:13 compute-0 nova_compute[192903]: 2025-10-06 14:17:13.762 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "94963e04-a73a-4d6e-8f87-59453794973a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:13 compute-0 nova_compute[192903]: 2025-10-06 14:17:13.763 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:13 compute-0 nova_compute[192903]: 2025-10-06 14:17:13.763 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "94963e04-a73a-4d6e-8f87-59453794973a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.278 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.278 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.279 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.279 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.494 2 WARNING nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.495 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.516 2 DEBUG oslo_concurrency.processutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.517 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.30210876464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.518 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:14 compute-0 nova_compute[192903]: 2025-10-06 14:17:14.518 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:15 compute-0 nova_compute[192903]: 2025-10-06 14:17:15.543 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Migration for instance 94963e04-a73a-4d6e-8f87-59453794973a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.051 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.085 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Migration 7f5fc136-7851-4fb5-8a7d-b39adadee57a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.086 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.086 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:17:14 up  1:18,  0 user,  load average: 0.25, 0.32, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.114 2 DEBUG nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.132 2 DEBUG nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.132 2 DEBUG nova.compute.provider_tree [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.147 2 DEBUG nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.169 2 DEBUG nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STATUS_DISABLED,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.204 2 DEBUG nova.compute.provider_tree [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:17:16 compute-0 nova_compute[192903]: 2025-10-06 14:17:16.710 2 DEBUG nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:17:17 compute-0 podman[223039]: 2025-10-06 14:17:17.22095558 +0000 UTC m=+0.075292568 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 14:17:17 compute-0 nova_compute[192903]: 2025-10-06 14:17:17.222 2 DEBUG nova.compute.resource_tracker [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:17:17 compute-0 nova_compute[192903]: 2025-10-06 14:17:17.222 2 DEBUG oslo_concurrency.lockutils [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.704s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:17 compute-0 nova_compute[192903]: 2025-10-06 14:17:17.246 2 INFO nova.compute.manager [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 06 14:17:18 compute-0 nova_compute[192903]: 2025-10-06 14:17:18.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:18 compute-0 nova_compute[192903]: 2025-10-06 14:17:18.381 2 INFO nova.scheduler.client.report [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Deleted allocation for migration 7f5fc136-7851-4fb5-8a7d-b39adadee57a
Oct 06 14:17:18 compute-0 nova_compute[192903]: 2025-10-06 14:17:18.382 2 DEBUG nova.virt.libvirt.driver [None req-867bc1f4-1080-48b6-82ca-3214611c4fb9 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 94963e04-a73a-4d6e-8f87-59453794973a] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 06 14:17:19 compute-0 nova_compute[192903]: 2025-10-06 14:17:19.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:21 compute-0 podman[223059]: 2025-10-06 14:17:21.247511391 +0000 UTC m=+0.100934616 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Oct 06 14:17:23 compute-0 nova_compute[192903]: 2025-10-06 14:17:23.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:24 compute-0 nova_compute[192903]: 2025-10-06 14:17:24.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:28 compute-0 nova_compute[192903]: 2025-10-06 14:17:28.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:28 compute-0 nova_compute[192903]: 2025-10-06 14:17:28.967 2 DEBUG nova.compute.manager [None req-95be58cc-b310-4067-81da-16ca23984fbf 6fa7b295cc3748d282cf3095cde65304 fd142f68afa1489aa76784748e93db34 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 06 14:17:29 compute-0 nova_compute[192903]: 2025-10-06 14:17:29.061 2 DEBUG nova.compute.provider_tree [None req-95be58cc-b310-4067-81da-16ca23984fbf 6fa7b295cc3748d282cf3095cde65304 fd142f68afa1489aa76784748e93db34 - - default default] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 18 to 20 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 14:17:29 compute-0 nova_compute[192903]: 2025-10-06 14:17:29.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:29 compute-0 podman[203308]: time="2025-10-06T14:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:17:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:17:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: ERROR   14:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: ERROR   14:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: ERROR   14:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: ERROR   14:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: ERROR   14:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:17:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:17:33 compute-0 nova_compute[192903]: 2025-10-06 14:17:33.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:34 compute-0 nova_compute[192903]: 2025-10-06 14:17:34.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:37 compute-0 podman[223083]: 2025-10-06 14:17:37.201004603 +0000 UTC m=+0.062112953 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Oct 06 14:17:37 compute-0 podman[223084]: 2025-10-06 14:17:37.220021042 +0000 UTC m=+0.076755337 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 06 14:17:37 compute-0 podman[223085]: 2025-10-06 14:17:37.229716947 +0000 UTC m=+0.071313044 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:17:37 compute-0 podman[223082]: 2025-10-06 14:17:37.261196043 +0000 UTC m=+0.123332980 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20250930)
Oct 06 14:17:38 compute-0 nova_compute[192903]: 2025-10-06 14:17:38.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:39 compute-0 nova_compute[192903]: 2025-10-06 14:17:39.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:43 compute-0 nova_compute[192903]: 2025-10-06 14:17:43.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:44 compute-0 nova_compute[192903]: 2025-10-06 14:17:44.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:48 compute-0 podman[223169]: 2025-10-06 14:17:48.219243369 +0000 UTC m=+0.077104186 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:17:48 compute-0 nova_compute[192903]: 2025-10-06 14:17:48.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:49 compute-0 nova_compute[192903]: 2025-10-06 14:17:49.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:52 compute-0 podman[223189]: 2025-10-06 14:17:52.201702928 +0000 UTC m=+0.064263748 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64)
Oct 06 14:17:53 compute-0 nova_compute[192903]: 2025-10-06 14:17:53.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:54 compute-0 nova_compute[192903]: 2025-10-06 14:17:54.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:54 compute-0 nova_compute[192903]: 2025-10-06 14:17:54.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:55 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:55.262 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:17:55 compute-0 nova_compute[192903]: 2025-10-06 14:17:55.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:55 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:17:55.263 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:17:55 compute-0 nova_compute[192903]: 2025-10-06 14:17:55.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.094 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.288 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.289 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.306 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.306 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.30210876464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.307 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:17:56 compute-0 nova_compute[192903]: 2025-10-06 14:17:56.307 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:17:57 compute-0 nova_compute[192903]: 2025-10-06 14:17:57.351 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:17:57 compute-0 nova_compute[192903]: 2025-10-06 14:17:57.351 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:17:56 up  1:18,  0 user,  load average: 0.12, 0.28, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:17:57 compute-0 nova_compute[192903]: 2025-10-06 14:17:57.372 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:17:57 compute-0 nova_compute[192903]: 2025-10-06 14:17:57.878 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:17:58 compute-0 nova_compute[192903]: 2025-10-06 14:17:58.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:58 compute-0 nova_compute[192903]: 2025-10-06 14:17:58.388 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:17:58 compute-0 nova_compute[192903]: 2025-10-06 14:17:58.389 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:17:59 compute-0 nova_compute[192903]: 2025-10-06 14:17:59.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:17:59 compute-0 podman[203308]: time="2025-10-06T14:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:17:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:17:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 06 14:18:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:00.265 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:00 compute-0 nova_compute[192903]: 2025-10-06 14:18:00.666 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:00 compute-0 nova_compute[192903]: 2025-10-06 14:18:00.666 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:01 compute-0 nova_compute[192903]: 2025-10-06 14:18:01.172 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: ERROR   14:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: ERROR   14:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: ERROR   14:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: ERROR   14:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: ERROR   14:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:18:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:18:01 compute-0 nova_compute[192903]: 2025-10-06 14:18:01.873 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:01 compute-0 nova_compute[192903]: 2025-10-06 14:18:01.873 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:01 compute-0 nova_compute[192903]: 2025-10-06 14:18:01.957 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:18:01 compute-0 nova_compute[192903]: 2025-10-06 14:18:01.958 2 INFO nova.compute.claims [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.014 2 DEBUG nova.compute.provider_tree [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.389 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.390 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.390 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.390 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.391 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:18:03 compute-0 nova_compute[192903]: 2025-10-06 14:18:03.523 2 DEBUG nova.scheduler.client.report [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.035 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.162s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.037 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.555 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.555 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.556 2 WARNING neutronclient.v2_0.client [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:04 compute-0 nova_compute[192903]: 2025-10-06 14:18:04.557 2 WARNING neutronclient.v2_0.client [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:05 compute-0 nova_compute[192903]: 2025-10-06 14:18:05.067 2 INFO nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:18:05 compute-0 nova_compute[192903]: 2025-10-06 14:18:05.148 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Successfully created port: 361eb8b6-322f-4593-8494-22ca046eaab3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:18:05 compute-0 nova_compute[192903]: 2025-10-06 14:18:05.577 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.601 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.602 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.603 2 INFO nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Creating image(s)
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.603 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.604 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.604 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.605 2 DEBUG oslo_utils.imageutils.format_inspector [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.609 2 DEBUG oslo_utils.imageutils.format_inspector [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.611 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.627 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Successfully updated port: 361eb8b6-322f-4593-8494-22ca046eaab3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.693 2 DEBUG nova.compute.manager [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-changed-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.694 2 DEBUG nova.compute.manager [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Refreshing instance network info cache due to event network-changed-361eb8b6-322f-4593-8494-22ca046eaab3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.694 2 DEBUG oslo_concurrency.lockutils [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.695 2 DEBUG oslo_concurrency.lockutils [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.695 2 DEBUG nova.network.neutron [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Refreshing network info cache for port 361eb8b6-322f-4593-8494-22ca046eaab3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.705 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.706 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.707 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.708 2 DEBUG oslo_utils.imageutils.format_inspector [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.715 2 DEBUG oslo_utils.imageutils.format_inspector [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.715 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.778 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.779 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.831 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.833 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.833 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.901 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.903 2 DEBUG nova.virt.disk.api [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Checking if we can resize image /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.904 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.993 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.995 2 DEBUG nova.virt.disk.api [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Cannot resize image /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.995 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.996 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Ensure instance console log exists: /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.997 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.997 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:06 compute-0 nova_compute[192903]: 2025-10-06 14:18:06.998 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:07 compute-0 nova_compute[192903]: 2025-10-06 14:18:07.138 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:18:07 compute-0 nova_compute[192903]: 2025-10-06 14:18:07.203 2 WARNING neutronclient.v2_0.client [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:07 compute-0 nova_compute[192903]: 2025-10-06 14:18:07.305 2 DEBUG nova.network.neutron [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:18:07 compute-0 nova_compute[192903]: 2025-10-06 14:18:07.506 2 DEBUG nova.network.neutron [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:18:08 compute-0 nova_compute[192903]: 2025-10-06 14:18:08.015 2 DEBUG oslo_concurrency.lockutils [req-2511812f-39f3-4745-a526-c9000b30bb1a req-27ea5fc2-a5e1-4fd2-bdb4-8659af2a0bae e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:18:08 compute-0 nova_compute[192903]: 2025-10-06 14:18:08.016 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquired lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:18:08 compute-0 nova_compute[192903]: 2025-10-06 14:18:08.016 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:18:08 compute-0 podman[223235]: 2025-10-06 14:18:08.238250367 +0000 UTC m=+0.073053799 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:18:08 compute-0 podman[223229]: 2025-10-06 14:18:08.24177756 +0000 UTC m=+0.073432659 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true)
Oct 06 14:18:08 compute-0 nova_compute[192903]: 2025-10-06 14:18:08.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:08 compute-0 podman[223227]: 2025-10-06 14:18:08.26956078 +0000 UTC m=+0.123765251 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 06 14:18:08 compute-0 podman[223228]: 2025-10-06 14:18:08.284236155 +0000 UTC m=+0.125340603 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4)
Oct 06 14:18:08 compute-0 nova_compute[192903]: 2025-10-06 14:18:08.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:09 compute-0 nova_compute[192903]: 2025-10-06 14:18:09.087 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:18:09 compute-0 nova_compute[192903]: 2025-10-06 14:18:09.332 2 WARNING neutronclient.v2_0.client [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:09 compute-0 nova_compute[192903]: 2025-10-06 14:18:09.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:09 compute-0 nova_compute[192903]: 2025-10-06 14:18:09.494 2 DEBUG nova.network.neutron [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Updating instance_info_cache with network_info: [{"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.003 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Releasing lock "refresh_cache-1f122883-4f1c-41b8-859c-862157a7bb48" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.004 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance network_info: |[{"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.007 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Start _get_guest_xml network_info=[{"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.012 2 WARNING nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.013 2 DEBUG nova.virt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-2116639016', uuid='1f122883-4f1c-41b8-859c-862157a7bb48'), owner=OwnerMeta(userid='98ee6da236ba42baa0fef11dcb52cbdd', username='tempest-TestExecuteStrategies-1255317741-project-admin', projectid='8f3f3b7d20fc4715811486da569fc0ab', projectname='tempest-TestExecuteStrategies-1255317741'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759760290.0137484) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.018 2 DEBUG nova.virt.libvirt.host [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.019 2 DEBUG nova.virt.libvirt.host [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.022 2 DEBUG nova.virt.libvirt.host [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.023 2 DEBUG nova.virt.libvirt.host [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.023 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.024 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.024 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.025 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.025 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.025 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.025 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.026 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.026 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.026 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.026 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.027 2 DEBUG nova.virt.hardware [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.031 2 DEBUG nova.virt.libvirt.vif [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2116639016',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2116639016',id=19,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-a95ab0ai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:18:05Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=1f122883-4f1c-41b8-859c-862157a7bb48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.031 2 DEBUG nova.network.os_vif_util [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.032 2 DEBUG nova.network.os_vif_util [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.033 2 DEBUG nova.objects.instance [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f122883-4f1c-41b8-859c-862157a7bb48 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.542 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <uuid>1f122883-4f1c-41b8-859c-862157a7bb48</uuid>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <name>instance-00000013</name>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-2116639016</nova:name>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:18:10</nova:creationTime>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:18:10 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:18:10 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         <nova:port uuid="361eb8b6-322f-4593-8494-22ca046eaab3">
Oct 06 14:18:10 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <system>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="serial">1f122883-4f1c-41b8-859c-862157a7bb48</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="uuid">1f122883-4f1c-41b8-859c-862157a7bb48</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </system>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <os>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </os>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <features>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </features>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.config"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:3c:b1:40"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <target dev="tap361eb8b6-32"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/console.log" append="off"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <video>
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </video>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:18:10 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:18:10 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:18:10 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:18:10 compute-0 nova_compute[192903]: </domain>
Oct 06 14:18:10 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.543 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Preparing to wait for external event network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.544 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.544 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.544 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.545 2 DEBUG nova.virt.libvirt.vif [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2116639016',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2116639016',id=19,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-a95ab0ai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:18:05Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=1f122883-4f1c-41b8-859c-862157a7bb48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.546 2 DEBUG nova.network.os_vif_util [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.547 2 DEBUG nova.network.os_vif_util [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.547 2 DEBUG os_vif [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.549 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.550 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c0067b13-717b-5a4e-a14f-80b2cd55732c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.559 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap361eb8b6-32, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap361eb8b6-32, col_values=(('qos', UUID('39025d5d-a3ad-4550-8cd2-46d368106d85')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap361eb8b6-32, col_values=(('external_ids', {'iface-id': '361eb8b6-322f-4593-8494-22ca046eaab3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:b1:40', 'vm-uuid': '1f122883-4f1c-41b8-859c-862157a7bb48'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 NetworkManager[52035]: <info>  [1759760290.5636] manager: (tap361eb8b6-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:10 compute-0 nova_compute[192903]: 2025-10-06 14:18:10.572 2 INFO os_vif [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32')
Oct 06 14:18:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:11.381 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:11.382 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:11.382 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:12 compute-0 nova_compute[192903]: 2025-10-06 14:18:12.109 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:18:12 compute-0 nova_compute[192903]: 2025-10-06 14:18:12.110 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:18:12 compute-0 nova_compute[192903]: 2025-10-06 14:18:12.110 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No VIF found with MAC fa:16:3e:3c:b1:40, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:18:12 compute-0 nova_compute[192903]: 2025-10-06 14:18:12.111 2 INFO nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Using config drive
Oct 06 14:18:12 compute-0 nova_compute[192903]: 2025-10-06 14:18:12.623 2 WARNING neutronclient.v2_0.client [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.231 2 INFO nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Creating config drive at /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.config
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.237 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmphdka0ru4 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.364 2 DEBUG oslo_concurrency.processutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmphdka0ru4" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:13 compute-0 kernel: tap361eb8b6-32: entered promiscuous mode
Oct 06 14:18:13 compute-0 ovn_controller[95205]: 2025-10-06T14:18:13Z|00162|binding|INFO|Claiming lport 361eb8b6-322f-4593-8494-22ca046eaab3 for this chassis.
Oct 06 14:18:13 compute-0 ovn_controller[95205]: 2025-10-06T14:18:13Z|00163|binding|INFO|361eb8b6-322f-4593-8494-22ca046eaab3: Claiming fa:16:3e:3c:b1:40 10.100.0.14
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.4614] manager: (tap361eb8b6-32): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.471 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:b1:40 10.100.0.14'], port_security=['fa:16:3e:3c:b1:40 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1f122883-4f1c-41b8-859c-862157a7bb48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=361eb8b6-322f-4593-8494-22ca046eaab3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.473 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 361eb8b6-322f-4593-8494-22ca046eaab3 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e bound to our chassis
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.475 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:18:13 compute-0 ovn_controller[95205]: 2025-10-06T14:18:13Z|00164|binding|INFO|Setting lport 361eb8b6-322f-4593-8494-22ca046eaab3 ovn-installed in OVS
Oct 06 14:18:13 compute-0 ovn_controller[95205]: 2025-10-06T14:18:13Z|00165|binding|INFO|Setting lport 361eb8b6-322f-4593-8494-22ca046eaab3 up in Southbound
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.494 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[264a1cf7-fff1-4197-97e2-3594f407854e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.495 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.497 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.498 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8d086121-9a23-4efb-a296-bb3c817f30cb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.499 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7d795206-ce3d-49b0-bacb-cd52ca32356f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 systemd-udevd[223327]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.5159] device (tap361eb8b6-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.5180] device (tap361eb8b6-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.518 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[6e41a421-7cbc-4e81-a2c0-e00500465069]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 systemd-machined[152985]: New machine qemu-14-instance-00000013.
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.526 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3876d7-2ffe-4e1c-a361-8d97e95f7087]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.558 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e09a28-6000-4fda-9aa5-610e0b074e68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.5632] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.563 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2db32835-6276-499c-99a9-93b19fc70f73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.591 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[48c3c25a-053d-4c7a-913d-c2b4955daff0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.594 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9be871-b2f9-49e1-9f7d-2ff7df52d401]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.6165] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.620 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[58d0aae2-1f3a-4120-b5e4-dd655427aed0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.635 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[446a7dac-344c-4d51-8f1c-0d5896b2cf0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475525, 'reachable_time': 33628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223361, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.655 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[aec3a366-17f5-4121-87b0-ca8a1db38bd5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475525, 'tstamp': 475525}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223362, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.674 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ff4a09-da60-4b26-91e4-4e561c030c7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475525, 'reachable_time': 33628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223363, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.716 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[01a3f346-1c9f-4b54-b55b-5230e28b1978]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.805 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a02d58-1216-4dfd-85a4-e3069f6aad60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.806 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.807 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.807 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:13 compute-0 NetworkManager[52035]: <info>  [1759760293.8092] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct 06 14:18:13 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.814 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:13 compute-0 ovn_controller[95205]: 2025-10-06T14:18:13Z|00166|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.828 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab33b39-ca1f-4a3a-a330-d6dd2a1a93dd]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.828 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.828 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.829 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.829 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.845 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[714aa672-eaba-4b39-aa97-c2717e8db459]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.845 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.846 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5da699-d36c-4797-b21a-8a4be285e5c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.847 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:18:13 compute-0 nova_compute[192903]: 2025-10-06 14:18:13.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:13 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:13.847 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.269 2 DEBUG nova.compute.manager [req-57259580-0bea-4cf9-84f5-750d217ea9fe req-3e5e9e6d-48d1-4214-b6c0-c3c277bbdf98 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.270 2 DEBUG oslo_concurrency.lockutils [req-57259580-0bea-4cf9-84f5-750d217ea9fe req-3e5e9e6d-48d1-4214-b6c0-c3c277bbdf98 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.270 2 DEBUG oslo_concurrency.lockutils [req-57259580-0bea-4cf9-84f5-750d217ea9fe req-3e5e9e6d-48d1-4214-b6c0-c3c277bbdf98 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.270 2 DEBUG oslo_concurrency.lockutils [req-57259580-0bea-4cf9-84f5-750d217ea9fe req-3e5e9e6d-48d1-4214-b6c0-c3c277bbdf98 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.271 2 DEBUG nova.compute.manager [req-57259580-0bea-4cf9-84f5-750d217ea9fe req-3e5e9e6d-48d1-4214-b6c0-c3c277bbdf98 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Processing event network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:18:14 compute-0 podman[223402]: 2025-10-06 14:18:14.28635926 +0000 UTC m=+0.063840087 container create 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:18:14 compute-0 systemd[1]: Started libpod-conmon-6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73.scope.
Oct 06 14:18:14 compute-0 podman[223402]: 2025-10-06 14:18:14.252040939 +0000 UTC m=+0.029521856 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:18:14 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5ae46bd9c6826ebf8b83eb53717f4cdeb109437659b56bb1745844731b3c22c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:18:14 compute-0 podman[223402]: 2025-10-06 14:18:14.375990524 +0000 UTC m=+0.153471371 container init 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 14:18:14 compute-0 podman[223402]: 2025-10-06 14:18:14.382442323 +0000 UTC m=+0.159923150 container start 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:18:14 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [NOTICE]   (223421) : New worker (223423) forked
Oct 06 14:18:14 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [NOTICE]   (223421) : Loading success.
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.661 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.664 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.667 2 INFO nova.virt.libvirt.driver [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance spawned successfully.
Oct 06 14:18:14 compute-0 nova_compute[192903]: 2025-10-06 14:18:14.667 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.185 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.186 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.187 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.187 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.188 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.189 2 DEBUG nova.virt.libvirt.driver [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.702 2 INFO nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Took 9.10 seconds to spawn the instance on the hypervisor.
Oct 06 14:18:15 compute-0 nova_compute[192903]: 2025-10-06 14:18:15.704 2 DEBUG nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.240 2 INFO nova.compute.manager [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Took 14.56 seconds to build instance.
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.336 2 DEBUG nova.compute.manager [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.337 2 DEBUG oslo_concurrency.lockutils [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.338 2 DEBUG oslo_concurrency.lockutils [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.339 2 DEBUG oslo_concurrency.lockutils [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.339 2 DEBUG nova.compute.manager [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] No waiting events found dispatching network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.340 2 WARNING nova.compute.manager [req-d8075736-b770-4d23-a59e-13d77fb97f1e req-c5c5d8ae-1bd8-44db-9772-717bd7b7ea86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received unexpected event network-vif-plugged-361eb8b6-322f-4593-8494-22ca046eaab3 for instance with vm_state active and task_state None.
Oct 06 14:18:16 compute-0 nova_compute[192903]: 2025-10-06 14:18:16.746 2 DEBUG oslo_concurrency.lockutils [None req-510bffcf-d307-4b73-b556-67f5bef4fb59 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:19 compute-0 podman[223432]: 2025-10-06 14:18:19.240894496 +0000 UTC m=+0.091227487 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:18:19 compute-0 nova_compute[192903]: 2025-10-06 14:18:19.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:20 compute-0 nova_compute[192903]: 2025-10-06 14:18:20.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:23 compute-0 podman[223452]: 2025-10-06 14:18:23.227878744 +0000 UTC m=+0.080061213 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Oct 06 14:18:24 compute-0 nova_compute[192903]: 2025-10-06 14:18:24.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:25 compute-0 nova_compute[192903]: 2025-10-06 14:18:25.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:27 compute-0 ovn_controller[95205]: 2025-10-06T14:18:27Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3c:b1:40 10.100.0.14
Oct 06 14:18:27 compute-0 ovn_controller[95205]: 2025-10-06T14:18:27Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3c:b1:40 10.100.0.14
Oct 06 14:18:29 compute-0 nova_compute[192903]: 2025-10-06 14:18:29.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:29 compute-0 podman[203308]: time="2025-10-06T14:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:18:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20763 "" "Go-http-client/1.1"
Oct 06 14:18:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 06 14:18:30 compute-0 nova_compute[192903]: 2025-10-06 14:18:30.190 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Creating tmpfile /var/lib/nova/instances/tmpsm55j4o6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:18:30 compute-0 nova_compute[192903]: 2025-10-06 14:18:30.191 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:30 compute-0 nova_compute[192903]: 2025-10-06 14:18:30.288 2 DEBUG nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsm55j4o6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:18:30 compute-0 nova_compute[192903]: 2025-10-06 14:18:30.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: ERROR   14:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: ERROR   14:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: ERROR   14:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: ERROR   14:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: ERROR   14:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:18:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:18:32 compute-0 nova_compute[192903]: 2025-10-06 14:18:32.324 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:34 compute-0 nova_compute[192903]: 2025-10-06 14:18:34.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:35 compute-0 nova_compute[192903]: 2025-10-06 14:18:35.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:36 compute-0 nova_compute[192903]: 2025-10-06 14:18:36.041 2 DEBUG nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsm55j4o6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e3fc8d91-13d4-4f62-9b6a-526a7a22e155',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:18:37 compute-0 nova_compute[192903]: 2025-10-06 14:18:37.058 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:18:37 compute-0 nova_compute[192903]: 2025-10-06 14:18:37.059 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:18:37 compute-0 nova_compute[192903]: 2025-10-06 14:18:37.059 2 DEBUG nova.network.neutron [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:18:37 compute-0 nova_compute[192903]: 2025-10-06 14:18:37.566 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:37 compute-0 nova_compute[192903]: 2025-10-06 14:18:37.908 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.030 2 DEBUG nova.network.neutron [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Updating instance_info_cache with network_info: [{"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.538 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.552 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsm55j4o6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e3fc8d91-13d4-4f62-9b6a-526a7a22e155',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.553 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Creating instance directory: /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.553 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Creating disk.info with the contents: {'/var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk': 'qcow2', '/var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.554 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:18:38 compute-0 nova_compute[192903]: 2025-10-06 14:18:38.554 2 DEBUG nova.objects.instance [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid e3fc8d91-13d4-4f62-9b6a-526a7a22e155 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.061 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.067 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.069 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.141 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.143 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.144 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.145 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.149 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.149 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.220 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.221 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 podman[223490]: 2025-10-06 14:18:39.236331379 +0000 UTC m=+0.081264335 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:18:39 compute-0 podman[223498]: 2025-10-06 14:18:39.263929214 +0000 UTC m=+0.098229181 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:18:39 compute-0 podman[223488]: 2025-10-06 14:18:39.264844898 +0000 UTC m=+0.124979353 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.266 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.267 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.267 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 podman[223489]: 2025-10-06 14:18:39.276767441 +0000 UTC m=+0.127939461 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.324 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.325 2 DEBUG nova.virt.disk.api [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.326 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.406 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.407 2 DEBUG nova.virt.disk.api [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.407 2 DEBUG nova.objects.instance [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid e3fc8d91-13d4-4f62-9b6a-526a7a22e155 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.913 2 DEBUG nova.objects.base [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<e3fc8d91-13d4-4f62-9b6a-526a7a22e155> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.914 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.955 2 DEBUG oslo_concurrency.processutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk.config 497664" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.956 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.957 2 DEBUG nova.virt.libvirt.vif [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2031034617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2031034617',id=18,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:17:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-0qp1mjz8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:17:56Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=e3fc8d91-13d4-4f62-9b6a-526a7a22e155,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.958 2 DEBUG nova.network.os_vif_util [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.959 2 DEBUG nova.network.os_vif_util [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.959 2 DEBUG os_vif [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4faf365b-c2bf-5a68-8b92-10090c1d138d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:39 compute-0 nova_compute[192903]: 2025-10-06 14:18:39.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.006 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd82b1892-0a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.007 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd82b1892-0a, col_values=(('qos', UUID('822500c7-7fba-4c62-a906-12213435385d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.008 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd82b1892-0a, col_values=(('external_ids', {'iface-id': 'd82b1892-0a97-4309-a337-b9f68f727ea7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:07:0e', 'vm-uuid': 'e3fc8d91-13d4-4f62-9b6a-526a7a22e155'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:40 compute-0 NetworkManager[52035]: <info>  [1759760320.0111] manager: (tapd82b1892-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.021 2 INFO os_vif [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a')
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.022 2 DEBUG nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.023 2 DEBUG nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsm55j4o6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e3fc8d91-13d4-4f62-9b6a-526a7a22e155',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.024 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.152 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.679 2 DEBUG nova.network.neutron [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Port d82b1892-0a97-4309-a337-b9f68f727ea7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:18:40 compute-0 nova_compute[192903]: 2025-10-06 14:18:40.692 2 DEBUG nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsm55j4o6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e3fc8d91-13d4-4f62-9b6a-526a7a22e155',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:18:43 compute-0 ovn_controller[95205]: 2025-10-06T14:18:43Z|00167|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:18:44 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:18:44 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:18:44 compute-0 kernel: tapd82b1892-0a: entered promiscuous mode
Oct 06 14:18:44 compute-0 NetworkManager[52035]: <info>  [1759760324.2783] manager: (tapd82b1892-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 ovn_controller[95205]: 2025-10-06T14:18:44Z|00168|binding|INFO|Claiming lport d82b1892-0a97-4309-a337-b9f68f727ea7 for this additional chassis.
Oct 06 14:18:44 compute-0 ovn_controller[95205]: 2025-10-06T14:18:44Z|00169|binding|INFO|d82b1892-0a97-4309-a337-b9f68f727ea7: Claiming fa:16:3e:e3:07:0e 10.100.0.7
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.289 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:07:0e 10.100.0.7'], port_security=['fa:16:3e:e3:07:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3fc8d91-13d4-4f62-9b6a-526a7a22e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d82b1892-0a97-4309-a337-b9f68f727ea7) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.291 104072 INFO neutron.agent.ovn.metadata.agent [-] Port d82b1892-0a97-4309-a337-b9f68f727ea7 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.292 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:18:44 compute-0 ovn_controller[95205]: 2025-10-06T14:18:44Z|00170|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 ovn-installed in OVS
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.314 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bc296bf6-b056-452a-bdd9-1a6ea47c0206]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 systemd-udevd[223623]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:18:44 compute-0 systemd-machined[152985]: New machine qemu-15-instance-00000012.
Oct 06 14:18:44 compute-0 NetworkManager[52035]: <info>  [1759760324.3345] device (tapd82b1892-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:18:44 compute-0 NetworkManager[52035]: <info>  [1759760324.3356] device (tapd82b1892-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:18:44 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.350 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[e945a44d-6483-4234-bea5-2c53066048c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.355 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9c3080-a11b-44d6-abd2-1b800d0c3c2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.395 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[741b9b95-efdf-4090-b72a-77fbffff14ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.414 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5a557c40-f962-40ea-9322-4547e8a11b66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475525, 'reachable_time': 33628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223636, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.440 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[928a9a1a-9fca-4a65-9d0a-4d55d59a53c8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475539, 'tstamp': 475539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223638, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475544, 'tstamp': 475544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223638, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.443 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 nova_compute[192903]: 2025-10-06 14:18:44.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.447 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.447 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.447 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.448 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:18:44 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:18:44.449 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7520a7-7b7a-4b2a-9c75-273df646896b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:18:45 compute-0 nova_compute[192903]: 2025-10-06 14:18:45.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:46 compute-0 ovn_controller[95205]: 2025-10-06T14:18:46Z|00171|binding|INFO|Claiming lport d82b1892-0a97-4309-a337-b9f68f727ea7 for this chassis.
Oct 06 14:18:46 compute-0 ovn_controller[95205]: 2025-10-06T14:18:46Z|00172|binding|INFO|d82b1892-0a97-4309-a337-b9f68f727ea7: Claiming fa:16:3e:e3:07:0e 10.100.0.7
Oct 06 14:18:46 compute-0 ovn_controller[95205]: 2025-10-06T14:18:46Z|00173|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 up in Southbound
Oct 06 14:18:48 compute-0 nova_compute[192903]: 2025-10-06 14:18:48.459 2 INFO nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Post operation of migration started
Oct 06 14:18:48 compute-0 nova_compute[192903]: 2025-10-06 14:18:48.460 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.153 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.154 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.236 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.236 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.237 2 DEBUG nova.network.neutron [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:49 compute-0 nova_compute[192903]: 2025-10-06 14:18:49.743 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:50 compute-0 nova_compute[192903]: 2025-10-06 14:18:50.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:50 compute-0 podman[223660]: 2025-10-06 14:18:50.226308495 +0000 UTC m=+0.082221450 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:18:50 compute-0 nova_compute[192903]: 2025-10-06 14:18:50.275 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:50 compute-0 nova_compute[192903]: 2025-10-06 14:18:50.427 2 DEBUG nova.network.neutron [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Updating instance_info_cache with network_info: [{"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:18:50 compute-0 nova_compute[192903]: 2025-10-06 14:18:50.934 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-e3fc8d91-13d4-4f62-9b6a-526a7a22e155" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:18:51 compute-0 nova_compute[192903]: 2025-10-06 14:18:51.452 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:51 compute-0 nova_compute[192903]: 2025-10-06 14:18:51.452 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:51 compute-0 nova_compute[192903]: 2025-10-06 14:18:51.453 2 DEBUG oslo_concurrency.lockutils [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:51 compute-0 nova_compute[192903]: 2025-10-06 14:18:51.458 2 INFO nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:18:51 compute-0 virtqemud[192802]: Domain id=15 name='instance-00000012' uuid=e3fc8d91-13d4-4f62-9b6a-526a7a22e155 is tainted: custom-monitor
Oct 06 14:18:52 compute-0 nova_compute[192903]: 2025-10-06 14:18:52.466 2 INFO nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:18:53 compute-0 nova_compute[192903]: 2025-10-06 14:18:53.471 2 INFO nova.virt.libvirt.driver [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:18:53 compute-0 nova_compute[192903]: 2025-10-06 14:18:53.476 2 DEBUG nova.compute.manager [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:18:53 compute-0 nova_compute[192903]: 2025-10-06 14:18:53.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:53 compute-0 nova_compute[192903]: 2025-10-06 14:18:53.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:18:53 compute-0 nova_compute[192903]: 2025-10-06 14:18:53.987 2 DEBUG nova.objects.instance [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:18:54 compute-0 nova_compute[192903]: 2025-10-06 14:18:54.092 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:18:54 compute-0 podman[223681]: 2025-10-06 14:18:54.235267811 +0000 UTC m=+0.091276048 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:18:54 compute-0 nova_compute[192903]: 2025-10-06 14:18:54.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.006 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.092 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.150 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.151 2 WARNING neutronclient.v2_0.client [None req-fd54caf7-7c52-4a21-9922-4a975406fd2c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:18:55 compute-0 nova_compute[192903]: 2025-10-06 14:18:55.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:18:56 compute-0 nova_compute[192903]: 2025-10-06 14:18:56.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:56 compute-0 nova_compute[192903]: 2025-10-06 14:18:56.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:56 compute-0 nova_compute[192903]: 2025-10-06 14:18:56.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:18:56 compute-0 nova_compute[192903]: 2025-10-06 14:18:56.101 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.205 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.274 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.275 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.331 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.338 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.394 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.394 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.448 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.605 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.606 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.627 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.628 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5542MB free_disk=73.24429702758789GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.628 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:18:57 compute-0 nova_compute[192903]: 2025-10-06 14:18:57.629 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:18:58 compute-0 nova_compute[192903]: 2025-10-06 14:18:58.650 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance e3fc8d91-13d4-4f62-9b6a-526a7a22e155 as it has an incoming, in-progress migration b7914764-2efb-4c4f-8d19-ca7ce436d829. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:18:58 compute-0 nova_compute[192903]: 2025-10-06 14:18:58.651 2 DEBUG nova.objects.instance [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.162 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.193 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 1f122883-4f1c-41b8-859c-862157a7bb48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.194 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance e3fc8d91-13d4-4f62-9b6a-526a7a22e155 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.194 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.195 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:18:57 up  1:19,  0 user,  load average: 0.73, 0.41, 0.41\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.249 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:18:59 compute-0 podman[203308]: time="2025-10-06T14:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:18:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20763 "" "Go-http-client/1.1"
Oct 06 14:18:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Oct 06 14:18:59 compute-0 nova_compute[192903]: 2025-10-06 14:18:59.757 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.239 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.240 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.240 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.241 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.241 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.262 2 INFO nova.compute.manager [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Terminating instance
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.271 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.271 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.642s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.785 2 DEBUG nova.compute.manager [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:19:00 compute-0 kernel: tap361eb8b6-32 (unregistering): left promiscuous mode
Oct 06 14:19:00 compute-0 NetworkManager[52035]: <info>  [1759760340.8137] device (tap361eb8b6-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:19:00 compute-0 ovn_controller[95205]: 2025-10-06T14:19:00Z|00174|binding|INFO|Releasing lport 361eb8b6-322f-4593-8494-22ca046eaab3 from this chassis (sb_readonly=0)
Oct 06 14:19:00 compute-0 ovn_controller[95205]: 2025-10-06T14:19:00Z|00175|binding|INFO|Setting lport 361eb8b6-322f-4593-8494-22ca046eaab3 down in Southbound
Oct 06 14:19:00 compute-0 ovn_controller[95205]: 2025-10-06T14:19:00Z|00176|binding|INFO|Removing iface tap361eb8b6-32 ovn-installed in OVS
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.873 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:b1:40 10.100.0.14'], port_security=['fa:16:3e:3c:b1:40 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1f122883-4f1c-41b8-859c-862157a7bb48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=361eb8b6-322f-4593-8494-22ca046eaab3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.874 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 361eb8b6-322f-4593-8494-22ca046eaab3 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.875 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.892 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9c4708e5-6d00-4fd9-920c-7f48c712ab7d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct 06 14:19:00 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 14.214s CPU time.
Oct 06 14:19:00 compute-0 systemd-machined[152985]: Machine qemu-14-instance-00000013 terminated.
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.919 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[17bb1c87-4cc4-4161-9398-d65112281869]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.921 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[86e16944-11d4-4c31-b7a9-4afbe0656e48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.949 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f60b289c-eb1f-41c2-9bde-6051c7798010]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.965 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[69893339-404e-48ad-aad4-00ec1318ea83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475525, 'reachable_time': 33628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223729, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.981 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ff98ecc0-c53d-4e8f-a851-5bcf7b0ab44e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475539, 'tstamp': 475539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223730, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475544, 'tstamp': 475544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223730, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.982 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:00 compute-0 nova_compute[192903]: 2025-10-06 14:19:00.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.989 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.989 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.989 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.989 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:19:00 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:00.991 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6695ca9e-d517-4e93-a2e4-0dd544ed7d8c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:01 compute-0 nova_compute[192903]: 2025-10-06 14:19:01.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:01 compute-0 nova_compute[192903]: 2025-10-06 14:19:01.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:01 compute-0 nova_compute[192903]: 2025-10-06 14:19:01.053 2 INFO nova.virt.libvirt.driver [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Instance destroyed successfully.
Oct 06 14:19:01 compute-0 nova_compute[192903]: 2025-10-06 14:19:01.053 2 DEBUG nova.objects.instance [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid 1f122883-4f1c-41b8-859c-862157a7bb48 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:19:01 compute-0 nova_compute[192903]: 2025-10-06 14:19:01.094 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: ERROR   14:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: ERROR   14:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: ERROR   14:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: ERROR   14:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: ERROR   14:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:19:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.699 2 DEBUG nova.compute.manager [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.699 2 DEBUG oslo_concurrency.lockutils [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.699 2 DEBUG oslo_concurrency.lockutils [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.699 2 DEBUG oslo_concurrency.lockutils [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.700 2 DEBUG nova.compute.manager [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] No waiting events found dispatching network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.700 2 DEBUG nova.compute.manager [req-58a8d5ae-5a35-4765-a5c2-040f326ceb2a req-e13cd44e-f2ec-4b02-95ce-39a89347a129 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.767 2 DEBUG nova.virt.libvirt.vif [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2116639016',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2116639016',id=19,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:18:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-a95ab0ai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:18:15Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=1f122883-4f1c-41b8-859c-862157a7bb48,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.768 2 DEBUG nova.network.os_vif_util [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "361eb8b6-322f-4593-8494-22ca046eaab3", "address": "fa:16:3e:3c:b1:40", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap361eb8b6-32", "ovs_interfaceid": "361eb8b6-322f-4593-8494-22ca046eaab3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.768 2 DEBUG nova.network.os_vif_util [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.769 2 DEBUG os_vif [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap361eb8b6-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=39025d5d-a3ad-4550-8cd2-46d368106d85) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.781 2 INFO os_vif [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:b1:40,bridge_name='br-int',has_traffic_filtering=True,id=361eb8b6-322f-4593-8494-22ca046eaab3,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap361eb8b6-32')
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.781 2 INFO nova.virt.libvirt.driver [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Deleting instance files /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48_del
Oct 06 14:19:02 compute-0 nova_compute[192903]: 2025-10-06 14:19:02.782 2 INFO nova.virt.libvirt.driver [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Deletion of /var/lib/nova/instances/1f122883-4f1c-41b8-859c-862157a7bb48_del complete
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.412 2 INFO nova.compute.manager [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Took 2.63 seconds to destroy the instance on the hypervisor.
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.412 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.413 2 DEBUG nova.compute.manager [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.413 2 DEBUG nova.network.neutron [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.414 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.681 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.681 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.681 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:03 compute-0 nova_compute[192903]: 2025-10-06 14:19:03.682 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.179 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.776 2 DEBUG nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.776 2 DEBUG oslo_concurrency.lockutils [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.776 2 DEBUG oslo_concurrency.lockutils [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 DEBUG oslo_concurrency.lockutils [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 DEBUG nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] No waiting events found dispatching network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 DEBUG nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-unplugged-361eb8b6-322f-4593-8494-22ca046eaab3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 DEBUG nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Received event network-vif-deleted-361eb8b6-322f-4593-8494-22ca046eaab3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 INFO nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Neutron deleted interface 361eb8b6-322f-4593-8494-22ca046eaab3; detaching it from the instance and deleting it from the info cache
Oct 06 14:19:04 compute-0 nova_compute[192903]: 2025-10-06 14:19:04.777 2 DEBUG nova.network.neutron [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:05 compute-0 nova_compute[192903]: 2025-10-06 14:19:05.022 2 DEBUG nova.network.neutron [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:05 compute-0 nova_compute[192903]: 2025-10-06 14:19:05.287 2 DEBUG nova.compute.manager [req-9c8996e4-dd5e-47cd-a13d-5100e04c5235 req-47ed607a-0325-4dc3-a42d-24021a33d858 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Detach interface failed, port_id=361eb8b6-322f-4593-8494-22ca046eaab3, reason: Instance 1f122883-4f1c-41b8-859c-862157a7bb48 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:19:05 compute-0 nova_compute[192903]: 2025-10-06 14:19:05.528 2 INFO nova.compute.manager [-] [instance: 1f122883-4f1c-41b8-859c-862157a7bb48] Took 2.11 seconds to deallocate network for instance.
Oct 06 14:19:06 compute-0 nova_compute[192903]: 2025-10-06 14:19:06.047 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:06 compute-0 nova_compute[192903]: 2025-10-06 14:19:06.048 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:06 compute-0 nova_compute[192903]: 2025-10-06 14:19:06.130 2 DEBUG nova.compute.provider_tree [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:19:06 compute-0 nova_compute[192903]: 2025-10-06 14:19:06.642 2 DEBUG nova.scheduler.client.report [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:19:07 compute-0 nova_compute[192903]: 2025-10-06 14:19:07.154 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:07 compute-0 nova_compute[192903]: 2025-10-06 14:19:07.178 2 INFO nova.scheduler.client.report [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance 1f122883-4f1c-41b8-859c-862157a7bb48
Oct 06 14:19:07 compute-0 nova_compute[192903]: 2025-10-06 14:19:07.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:07 compute-0 nova_compute[192903]: 2025-10-06 14:19:07.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.212 2 DEBUG oslo_concurrency.lockutils [None req-bc49f385-530e-404f-ba73-4c6c0aedf01a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "1f122883-4f1c-41b8-859c-862157a7bb48" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.972s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.899 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.899 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.900 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.900 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.900 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:08 compute-0 nova_compute[192903]: 2025-10-06 14:19:08.913 2 INFO nova.compute.manager [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Terminating instance
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.433 2 DEBUG nova.compute.manager [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:19:09 compute-0 kernel: tapd82b1892-0a (unregistering): left promiscuous mode
Oct 06 14:19:09 compute-0 NetworkManager[52035]: <info>  [1759760349.4585] device (tapd82b1892-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00177|binding|INFO|Releasing lport d82b1892-0a97-4309-a337-b9f68f727ea7 from this chassis (sb_readonly=0)
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00178|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 down in Southbound
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00179|binding|INFO|Removing iface tapd82b1892-0a ovn-installed in OVS
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.479 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:07:0e 10.100.0.7'], port_security=['fa:16:3e:e3:07:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3fc8d91-13d4-4f62-9b6a-526a7a22e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=d82b1892-0a97-4309-a337-b9f68f727ea7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.481 104072 INFO neutron.agent.ovn.metadata.agent [-] Port d82b1892-0a97-4309-a337-b9f68f727ea7 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.483 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.486 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdf91f5-40fa-4d99-9e81-cd407299d34d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.486 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Oct 06 14:19:09 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 2.608s CPU time.
Oct 06 14:19:09 compute-0 systemd-machined[152985]: Machine qemu-15-instance-00000012 terminated.
Oct 06 14:19:09 compute-0 podman[223758]: 2025-10-06 14:19:09.621508122 +0000 UTC m=+0.083826362 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:19:09 compute-0 podman[223759]: 2025-10-06 14:19:09.624547562 +0000 UTC m=+0.080868235 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.632 2 DEBUG nova.compute.manager [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.633 2 DEBUG oslo_concurrency.lockutils [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.633 2 DEBUG oslo_concurrency.lockutils [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.634 2 DEBUG oslo_concurrency.lockutils [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.634 2 DEBUG nova.compute.manager [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.634 2 DEBUG nova.compute.manager [req-56e9ec69-fe1d-4d73-84d4-e339eff7e44a req-63689cc8-f2de-4dc4-8968-c681a4a84402 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:19:09 compute-0 podman[223756]: 2025-10-06 14:19:09.642701269 +0000 UTC m=+0.123062503 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:19:09 compute-0 podman[223828]: 2025-10-06 14:19:09.653483862 +0000 UTC m=+0.040191167 container kill 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:19:09 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [NOTICE]   (223421) : haproxy version is 3.0.5-8e879a5
Oct 06 14:19:09 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [NOTICE]   (223421) : path to executable is /usr/sbin/haproxy
Oct 06 14:19:09 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [WARNING]  (223421) : Exiting Master process...
Oct 06 14:19:09 compute-0 kernel: tapd82b1892-0a: entered promiscuous mode
Oct 06 14:19:09 compute-0 NetworkManager[52035]: <info>  [1759760349.6591] manager: (tapd82b1892-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Oct 06 14:19:09 compute-0 systemd-udevd[223810]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:19:09 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [ALERT]    (223421) : Current worker (223423) exited with code 143 (Terminated)
Oct 06 14:19:09 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[223417]: [WARNING]  (223421) : All workers exited. Exiting... (0)
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00180|binding|INFO|Claiming lport d82b1892-0a97-4309-a337-b9f68f727ea7 for this chassis.
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00181|binding|INFO|d82b1892-0a97-4309-a337-b9f68f727ea7: Claiming fa:16:3e:e3:07:0e 10.100.0.7
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 podman[223751]: 2025-10-06 14:19:09.663451944 +0000 UTC m=+0.143805198 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:19:09 compute-0 systemd[1]: libpod-6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73.scope: Deactivated successfully.
Oct 06 14:19:09 compute-0 kernel: tapd82b1892-0a (unregistering): left promiscuous mode
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.667 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:07:0e 10.100.0.7'], port_security=['fa:16:3e:e3:07:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3fc8d91-13d4-4f62-9b6a-526a7a22e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=d82b1892-0a97-4309-a337-b9f68f727ea7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00182|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 ovn-installed in OVS
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00183|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 up in Southbound
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00184|binding|INFO|Releasing lport d82b1892-0a97-4309-a337-b9f68f727ea7 from this chassis (sb_readonly=1)
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00185|if_status|INFO|Dropped 2 log messages in last 413 seconds (most recently, 413 seconds ago) due to excessive rate
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00186|if_status|INFO|Not setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 down as sb is readonly
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00187|binding|INFO|Removing iface tapd82b1892-0a ovn-installed in OVS
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00188|binding|INFO|Releasing lport d82b1892-0a97-4309-a337-b9f68f727ea7 from this chassis (sb_readonly=0)
Oct 06 14:19:09 compute-0 ovn_controller[95205]: 2025-10-06T14:19:09Z|00189|binding|INFO|Setting lport d82b1892-0a97-4309-a337-b9f68f727ea7 down in Southbound
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 podman[223873]: 2025-10-06 14:19:09.710708175 +0000 UTC m=+0.029940628 container died 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.713 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:07:0e 10.100.0.7'], port_security=['fa:16:3e:e3:07:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3fc8d91-13d4-4f62-9b6a-526a7a22e155', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=d82b1892-0a97-4309-a337-b9f68f727ea7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.721 2 INFO nova.virt.libvirt.driver [-] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Instance destroyed successfully.
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.722 2 DEBUG nova.objects.instance [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid e3fc8d91-13d4-4f62-9b6a-526a7a22e155 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:19:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73-userdata-shm.mount: Deactivated successfully.
Oct 06 14:19:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-a5ae46bd9c6826ebf8b83eb53717f4cdeb109437659b56bb1745844731b3c22c-merged.mount: Deactivated successfully.
Oct 06 14:19:09 compute-0 podman[223873]: 2025-10-06 14:19:09.746843233 +0000 UTC m=+0.066075686 container cleanup 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:19:09 compute-0 systemd[1]: libpod-conmon-6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73.scope: Deactivated successfully.
Oct 06 14:19:09 compute-0 podman[223874]: 2025-10-06 14:19:09.769011486 +0000 UTC m=+0.077518037 container remove 6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.776 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4e137c11-af0f-44ac-b421-46a78560aa17]: (4, ("Mon Oct  6 02:19:09 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73)\n6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73\nMon Oct  6 02:19:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73)\n6b5a3535aca403a3b4ce7045dbe8d4d62620876d1ad968322c9f4a9ea0a84c73\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.778 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3cd31d-6a30-47ea-86f7-02251f0a2964]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.779 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.780 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[364c0094-0acf-4aa1-955a-0cefb4e38778]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.781 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 nova_compute[192903]: 2025-10-06 14:19:09.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.803 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e87cd922-08dc-477c-a856-058c0157345e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.828 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bb21b208-9377-43af-b4ca-469ea7eb0784]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.830 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[95c4f581-fb62-4ba3-bbbc-b65fb192cff3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.851 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bccd3ec2-b47f-4b1b-9eed-4ea527983b44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475519, 'reachable_time': 37087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223914, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.854 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.854 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[a54022c4-4449-400a-8d18-7da18923a722]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.856 104072 INFO neutron.agent.ovn.metadata.agent [-] Port d82b1892-0a97-4309-a337-b9f68f727ea7 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.857 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.858 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce556fd-774c-492e-ac93-f2fb1472a5d6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.859 104072 INFO neutron.agent.ovn.metadata.agent [-] Port d82b1892-0a97-4309-a337-b9f68f727ea7 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.860 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:19:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:09.861 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c74414e0-2515-4b3e-8468-b9f47777a04d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.229 2 DEBUG nova.virt.libvirt.vif [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2031034617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2031034617',id=18,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:17:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-0qp1mjz8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:18:54Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=e3fc8d91-13d4-4f62-9b6a-526a7a22e155,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.230 2 DEBUG nova.network.os_vif_util [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "d82b1892-0a97-4309-a337-b9f68f727ea7", "address": "fa:16:3e:e3:07:0e", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd82b1892-0a", "ovs_interfaceid": "d82b1892-0a97-4309-a337-b9f68f727ea7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.231 2 DEBUG nova.network.os_vif_util [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.231 2 DEBUG os_vif [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd82b1892-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=822500c7-7fba-4c62-a906-12213435385d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.239 2 INFO os_vif [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:07:0e,bridge_name='br-int',has_traffic_filtering=True,id=d82b1892-0a97-4309-a337-b9f68f727ea7,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd82b1892-0a')
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.240 2 INFO nova.virt.libvirt.driver [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Deleting instance files /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155_del
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.240 2 INFO nova.virt.libvirt.driver [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Deletion of /var/lib/nova/instances/e3fc8d91-13d4-4f62-9b6a-526a7a22e155_del complete
Oct 06 14:19:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:10.301 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:10 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:10.303 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.759 2 INFO nova.compute.manager [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.759 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.760 2 DEBUG nova.compute.manager [-] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.760 2 DEBUG nova.network.neutron [-] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:19:10 compute-0 nova_compute[192903]: 2025-10-06 14:19:10.760 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.193 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:11.304 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:11.383 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:11.384 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:11.384 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.555 2 DEBUG nova.compute.manager [req-d38878be-b102-4b45-b6db-ebb9d35c3b4a req-e96120b5-b5b5-4cc5-b72d-22528efcdfc2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-deleted-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.556 2 INFO nova.compute.manager [req-d38878be-b102-4b45-b6db-ebb9d35c3b4a req-e96120b5-b5b5-4cc5-b72d-22528efcdfc2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Neutron deleted interface d82b1892-0a97-4309-a337-b9f68f727ea7; detaching it from the instance and deleting it from the info cache
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.556 2 DEBUG nova.network.neutron [req-d38878be-b102-4b45-b6db-ebb9d35c3b4a req-e96120b5-b5b5-4cc5-b72d-22528efcdfc2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.691 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.692 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.692 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.693 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.694 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.694 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.694 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.695 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.695 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.696 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.696 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.696 2 WARNING nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received unexpected event network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with vm_state active and task_state deleting.
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.697 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.697 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.698 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.698 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.698 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.699 2 WARNING nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received unexpected event network-vif-plugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with vm_state active and task_state deleting.
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.699 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.700 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.700 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.700 2 DEBUG oslo_concurrency.lockutils [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.701 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:11 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.701 2 DEBUG nova.compute.manager [req-0358802a-1f27-4caf-bd7e-3613af15b034 req-29d7b569-4d45-4b60-83b0-8ae25790f317 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:19:12 compute-0 nova_compute[192903]: 2025-10-06 14:19:11.999 2 DEBUG nova.network.neutron [-] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:12 compute-0 nova_compute[192903]: 2025-10-06 14:19:12.066 2 DEBUG nova.compute.manager [req-d38878be-b102-4b45-b6db-ebb9d35c3b4a req-e96120b5-b5b5-4cc5-b72d-22528efcdfc2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Detach interface failed, port_id=d82b1892-0a97-4309-a337-b9f68f727ea7, reason: Instance e3fc8d91-13d4-4f62-9b6a-526a7a22e155 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:19:12 compute-0 nova_compute[192903]: 2025-10-06 14:19:12.510 2 INFO nova.compute.manager [-] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Took 1.75 seconds to deallocate network for instance.
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.033 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.034 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.107 2 DEBUG nova.compute.provider_tree [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.615 2 DEBUG nova.scheduler.client.report [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.765 2 DEBUG nova.compute.manager [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.765 2 DEBUG oslo_concurrency.lockutils [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.766 2 DEBUG oslo_concurrency.lockutils [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.766 2 DEBUG oslo_concurrency.lockutils [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.767 2 DEBUG nova.compute.manager [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] No waiting events found dispatching network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:13 compute-0 nova_compute[192903]: 2025-10-06 14:19:13.767 2 WARNING nova.compute.manager [req-95ab3c4b-eee6-4a67-b29e-8153646adbc2 req-610c9e4f-04e0-4923-b2b3-068181acfbf8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: e3fc8d91-13d4-4f62-9b6a-526a7a22e155] Received unexpected event network-vif-unplugged-d82b1892-0a97-4309-a337-b9f68f727ea7 for instance with vm_state deleted and task_state None.
Oct 06 14:19:14 compute-0 nova_compute[192903]: 2025-10-06 14:19:14.128 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:14 compute-0 nova_compute[192903]: 2025-10-06 14:19:14.161 2 INFO nova.scheduler.client.report [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance e3fc8d91-13d4-4f62-9b6a-526a7a22e155
Oct 06 14:19:14 compute-0 nova_compute[192903]: 2025-10-06 14:19:14.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:15 compute-0 nova_compute[192903]: 2025-10-06 14:19:15.188 2 DEBUG oslo_concurrency.lockutils [None req-ad0cd3ec-f903-4710-9001-92dabe831199 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "e3fc8d91-13d4-4f62-9b6a-526a7a22e155" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.288s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:15 compute-0 nova_compute[192903]: 2025-10-06 14:19:15.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:16 compute-0 nova_compute[192903]: 2025-10-06 14:19:16.153 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:17 compute-0 nova_compute[192903]: 2025-10-06 14:19:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:17 compute-0 nova_compute[192903]: 2025-10-06 14:19:17.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:19:19 compute-0 nova_compute[192903]: 2025-10-06 14:19:19.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:20 compute-0 nova_compute[192903]: 2025-10-06 14:19:20.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:21 compute-0 podman[223918]: 2025-10-06 14:19:21.222121374 +0000 UTC m=+0.084245083 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 06 14:19:24 compute-0 nova_compute[192903]: 2025-10-06 14:19:24.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:25 compute-0 podman[223941]: 2025-10-06 14:19:25.21090446 +0000 UTC m=+0.072068393 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 14:19:25 compute-0 nova_compute[192903]: 2025-10-06 14:19:25.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:29 compute-0 nova_compute[192903]: 2025-10-06 14:19:29.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:29 compute-0 podman[203308]: time="2025-10-06T14:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:19:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:19:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 06 14:19:30 compute-0 nova_compute[192903]: 2025-10-06 14:19:30.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: ERROR   14:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: ERROR   14:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: ERROR   14:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: ERROR   14:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: ERROR   14:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:19:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:19:34 compute-0 nova_compute[192903]: 2025-10-06 14:19:34.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:35 compute-0 nova_compute[192903]: 2025-10-06 14:19:35.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:38 compute-0 nova_compute[192903]: 2025-10-06 14:19:38.144 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:39 compute-0 nova_compute[192903]: 2025-10-06 14:19:39.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:39 compute-0 nova_compute[192903]: 2025-10-06 14:19:39.984 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:39 compute-0 nova_compute[192903]: 2025-10-06 14:19:39.984 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:40 compute-0 podman[223965]: 2025-10-06 14:19:40.230071231 +0000 UTC m=+0.077839845 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 06 14:19:40 compute-0 podman[223972]: 2025-10-06 14:19:40.230260346 +0000 UTC m=+0.070548273 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:19:40 compute-0 nova_compute[192903]: 2025-10-06 14:19:40.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:40 compute-0 podman[223964]: 2025-10-06 14:19:40.251223827 +0000 UTC m=+0.112339571 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:19:40 compute-0 podman[223966]: 2025-10-06 14:19:40.256511865 +0000 UTC m=+0.098948969 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 06 14:19:40 compute-0 nova_compute[192903]: 2025-10-06 14:19:40.491 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:19:41 compute-0 nova_compute[192903]: 2025-10-06 14:19:41.049 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:41 compute-0 nova_compute[192903]: 2025-10-06 14:19:41.049 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:41 compute-0 nova_compute[192903]: 2025-10-06 14:19:41.058 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:19:41 compute-0 nova_compute[192903]: 2025-10-06 14:19:41.058 2 INFO nova.compute.claims [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:19:42 compute-0 nova_compute[192903]: 2025-10-06 14:19:42.141 2 DEBUG nova.compute.provider_tree [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:19:42 compute-0 nova_compute[192903]: 2025-10-06 14:19:42.650 2 DEBUG nova.scheduler.client.report [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.162 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.163 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.677 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.678 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.678 2 WARNING neutronclient.v2_0.client [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:43 compute-0 nova_compute[192903]: 2025-10-06 14:19:43.679 2 WARNING neutronclient.v2_0.client [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:44 compute-0 nova_compute[192903]: 2025-10-06 14:19:44.188 2 INFO nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:19:44 compute-0 nova_compute[192903]: 2025-10-06 14:19:44.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:44 compute-0 nova_compute[192903]: 2025-10-06 14:19:44.628 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Successfully created port: 45e2333d-26ca-45dd-947d-99a29d059183 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:19:44 compute-0 nova_compute[192903]: 2025-10-06 14:19:44.697 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.688 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Successfully updated port: 45e2333d-26ca-45dd-947d-99a29d059183 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.716 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.717 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.717 2 INFO nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Creating image(s)
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.717 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.718 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.718 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.719 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.721 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.722 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.769 2 DEBUG nova.compute.manager [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-changed-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.770 2 DEBUG nova.compute.manager [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Refreshing instance network info cache due to event network-changed-45e2333d-26ca-45dd-947d-99a29d059183. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.771 2 DEBUG oslo_concurrency.lockutils [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.771 2 DEBUG oslo_concurrency.lockutils [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.772 2 DEBUG nova.network.neutron [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Refreshing network info cache for port 45e2333d-26ca-45dd-947d-99a29d059183 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.787 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.788 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.788 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.790 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.797 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.797 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.882 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.883 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.938 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.939 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.940 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.997 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:45 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.999 2 DEBUG nova.virt.disk.api [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Checking if we can resize image /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:45.999 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.059 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.061 2 DEBUG nova.virt.disk.api [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Cannot resize image /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.062 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.062 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Ensure instance console log exists: /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.063 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.064 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.064 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.195 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:19:46 compute-0 nova_compute[192903]: 2025-10-06 14:19:46.282 2 WARNING neutronclient.v2_0.client [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:47 compute-0 nova_compute[192903]: 2025-10-06 14:19:47.180 2 DEBUG nova.network.neutron [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:19:47 compute-0 nova_compute[192903]: 2025-10-06 14:19:47.346 2 DEBUG nova.network.neutron [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:47 compute-0 nova_compute[192903]: 2025-10-06 14:19:47.854 2 DEBUG oslo_concurrency.lockutils [req-53d542a3-a952-429d-93f5-fce05cf4242c req-60a388ba-2016-498a-bfea-2b74ca3208c1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:19:47 compute-0 nova_compute[192903]: 2025-10-06 14:19:47.855 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquired lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:19:47 compute-0 nova_compute[192903]: 2025-10-06 14:19:47.855 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:19:49 compute-0 nova_compute[192903]: 2025-10-06 14:19:49.193 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:19:49 compute-0 nova_compute[192903]: 2025-10-06 14:19:49.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:49 compute-0 nova_compute[192903]: 2025-10-06 14:19:49.409 2 WARNING neutronclient.v2_0.client [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:49 compute-0 nova_compute[192903]: 2025-10-06 14:19:49.791 2 DEBUG nova.network.neutron [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Updating instance_info_cache with network_info: [{"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.299 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Releasing lock "refresh_cache-0764cef7-e2fc-48c0-af26-f628def27fb4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.300 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance network_info: |[{"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.305 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Start _get_guest_xml network_info=[{"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.311 2 WARNING nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.312 2 DEBUG nova.virt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-512152231', uuid='0764cef7-e2fc-48c0-af26-f628def27fb4'), owner=OwnerMeta(userid='98ee6da236ba42baa0fef11dcb52cbdd', username='tempest-TestExecuteStrategies-1255317741-project-admin', projectid='8f3f3b7d20fc4715811486da569fc0ab', projectname='tempest-TestExecuteStrategies-1255317741'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759760390.312628) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.318 2 DEBUG nova.virt.libvirt.host [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.318 2 DEBUG nova.virt.libvirt.host [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.321 2 DEBUG nova.virt.libvirt.host [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.322 2 DEBUG nova.virt.libvirt.host [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.322 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.323 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.323 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.324 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.324 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.324 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.324 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.324 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.325 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.325 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.325 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.325 2 DEBUG nova.virt.hardware [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.329 2 DEBUG nova.virt.libvirt.vif [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-512152231',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-512152231',id=21,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-d53597g0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:19:44Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=0764cef7-e2fc-48c0-af26-f628def27fb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.330 2 DEBUG nova.network.os_vif_util [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.331 2 DEBUG nova.network.os_vif_util [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.331 2 DEBUG nova.objects.instance [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'pci_devices' on Instance uuid 0764cef7-e2fc-48c0-af26-f628def27fb4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.840 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <uuid>0764cef7-e2fc-48c0-af26-f628def27fb4</uuid>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <name>instance-00000015</name>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-512152231</nova:name>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:19:50</nova:creationTime>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:19:50 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:19:50 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         <nova:port uuid="45e2333d-26ca-45dd-947d-99a29d059183">
Oct 06 14:19:50 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <system>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="serial">0764cef7-e2fc-48c0-af26-f628def27fb4</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="uuid">0764cef7-e2fc-48c0-af26-f628def27fb4</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </system>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <os>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </os>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <features>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </features>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.config"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:b1:5b:96"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <target dev="tap45e2333d-26"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/console.log" append="off"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <video>
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </video>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:19:50 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:19:50 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:19:50 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:19:50 compute-0 nova_compute[192903]: </domain>
Oct 06 14:19:50 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.841 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Preparing to wait for external event network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.841 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.842 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.842 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.843 2 DEBUG nova.virt.libvirt.vif [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-512152231',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-512152231',id=21,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-d53597g0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:19:44Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=0764cef7-e2fc-48c0-af26-f628def27fb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.844 2 DEBUG nova.network.os_vif_util [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.845 2 DEBUG nova.network.os_vif_util [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.845 2 DEBUG os_vif [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.847 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8d5d1a36-4187-577f-818b-b130fc40e750', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45e2333d-26, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap45e2333d-26, col_values=(('qos', UUID('e7e205a1-8d7b-4eb8-bd8a-6adc84fb38d1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap45e2333d-26, col_values=(('external_ids', {'iface-id': '45e2333d-26ca-45dd-947d-99a29d059183', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:5b:96', 'vm-uuid': '0764cef7-e2fc-48c0-af26-f628def27fb4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:50 compute-0 NetworkManager[52035]: <info>  [1759760390.8632] manager: (tap45e2333d-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:50 compute-0 nova_compute[192903]: 2025-10-06 14:19:50.872 2 INFO os_vif [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26')
Oct 06 14:19:52 compute-0 podman[224064]: 2025-10-06 14:19:52.214067721 +0000 UTC m=+0.080558266 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 06 14:19:52 compute-0 nova_compute[192903]: 2025-10-06 14:19:52.424 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:19:52 compute-0 nova_compute[192903]: 2025-10-06 14:19:52.424 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:19:52 compute-0 nova_compute[192903]: 2025-10-06 14:19:52.425 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No VIF found with MAC fa:16:3e:b1:5b:96, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:19:52 compute-0 nova_compute[192903]: 2025-10-06 14:19:52.425 2 INFO nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Using config drive
Oct 06 14:19:52 compute-0 nova_compute[192903]: 2025-10-06 14:19:52.938 2 WARNING neutronclient.v2_0.client [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.307 2 INFO nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Creating config drive at /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.config
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.319 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpk6wjvkxz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.465 2 DEBUG oslo_concurrency.processutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpk6wjvkxz" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:53 compute-0 kernel: tap45e2333d-26: entered promiscuous mode
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.5343] manager: (tap45e2333d-26): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Oct 06 14:19:53 compute-0 ovn_controller[95205]: 2025-10-06T14:19:53Z|00190|binding|INFO|Claiming lport 45e2333d-26ca-45dd-947d-99a29d059183 for this chassis.
Oct 06 14:19:53 compute-0 ovn_controller[95205]: 2025-10-06T14:19:53Z|00191|binding|INFO|45e2333d-26ca-45dd-947d-99a29d059183: Claiming fa:16:3e:b1:5b:96 10.100.0.7
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.543 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:5b:96 10.100.0.7'], port_security=['fa:16:3e:b1:5b:96 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0764cef7-e2fc-48c0-af26-f628def27fb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=45e2333d-26ca-45dd-947d-99a29d059183) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.544 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 45e2333d-26ca-45dd-947d-99a29d059183 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e bound to our chassis
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.545 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_controller[95205]: 2025-10-06T14:19:53Z|00192|binding|INFO|Setting lport 45e2333d-26ca-45dd-947d-99a29d059183 ovn-installed in OVS
Oct 06 14:19:53 compute-0 ovn_controller[95205]: 2025-10-06T14:19:53Z|00193|binding|INFO|Setting lport 45e2333d-26ca-45dd-947d-99a29d059183 up in Southbound
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.560 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0537f5-573e-4d44-ad3a-e1d975d90b07]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 systemd-udevd[224102]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.561 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.564 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.564 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[34381a55-3f0d-4dae-b8e7-a14b88d21795]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.565 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8e086c2f-92d7-4eb1-a936-be6027848ce7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 systemd-machined[152985]: New machine qemu-16-instance-00000015.
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.5775] device (tap45e2333d-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.5788] device (tap45e2333d-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:19:53 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.580 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7cff34-31ee-492d-8051-c08b185cf679]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.596 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[339bcde6-cafa-44d0-b498-9ea3ccd88b63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.625 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[50701ee2-7f12-4c43-b644-f1cddb147680]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.630 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9606efae-aba1-4d2c-b79f-d8d186395b2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.6316] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Oct 06 14:19:53 compute-0 systemd-udevd[224106]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.657 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[9c4cb6e8-0975-432e-97d5-09659704e26d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.659 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[d0daeac6-f185-488e-a03a-3d27ed7e97df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.6761] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.679 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[da0c7532-68db-4a4f-b144-529bc820cba2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.695 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3d8698-9bd8-43b0-9959-b5a92d247e78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485531, 'reachable_time': 29490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224135, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.707 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[fa393f46-1a86-46da-9188-377721cefb8e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485531, 'tstamp': 485531}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224136, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.721 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9033f583-7f00-4ad0-98f9-cda8b634f3bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485531, 'reachable_time': 29490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224137, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.763 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[18044104-6d27-45ec-beed-31ee2aac723c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.861 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b54efdf7-2d34-4ae5-9be5-fac6e133bc41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.863 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.863 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.864 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:19:53 compute-0 NetworkManager[52035]: <info>  [1759760393.8677] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.869 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_controller[95205]: 2025-10-06T14:19:53Z|00194|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.874 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd46b1-1419-4778-a870-cfa1ebbaf596]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.875 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.875 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.875 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.875 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.876 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4ecf941e-7a97-4d63-9872-07753bbe47bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.876 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.877 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9f275f-fbe2-42f7-a1f4-05b24fefe21e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.877 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:19:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:19:53.878 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:19:53 compute-0 nova_compute[192903]: 2025-10-06 14:19:53.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.285 2 DEBUG nova.compute.manager [req-4920fdd1-2ded-4425-9971-ba3f23ac56bc req-62701a8e-1805-4928-aa30-c848ac32bf2d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.287 2 DEBUG oslo_concurrency.lockutils [req-4920fdd1-2ded-4425-9971-ba3f23ac56bc req-62701a8e-1805-4928-aa30-c848ac32bf2d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.288 2 DEBUG oslo_concurrency.lockutils [req-4920fdd1-2ded-4425-9971-ba3f23ac56bc req-62701a8e-1805-4928-aa30-c848ac32bf2d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.289 2 DEBUG oslo_concurrency.lockutils [req-4920fdd1-2ded-4425-9971-ba3f23ac56bc req-62701a8e-1805-4928-aa30-c848ac32bf2d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.290 2 DEBUG nova.compute.manager [req-4920fdd1-2ded-4425-9971-ba3f23ac56bc req-62701a8e-1805-4928-aa30-c848ac32bf2d e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Processing event network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:19:54 compute-0 podman[224176]: 2025-10-06 14:19:54.323499985 +0000 UTC m=+0.077345802 container create 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 06 14:19:54 compute-0 systemd[1]: Started libpod-conmon-2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2.scope.
Oct 06 14:19:54 compute-0 podman[224176]: 2025-10-06 14:19:54.291974937 +0000 UTC m=+0.045820764 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:19:54 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fd094fff50763222635f4ddbcdf442ac2b516996844745332f2b7bf7ad1f647/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:19:54 compute-0 podman[224176]: 2025-10-06 14:19:54.433913255 +0000 UTC m=+0.187759172 container init 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:19:54 compute-0 podman[224176]: 2025-10-06 14:19:54.445107358 +0000 UTC m=+0.198953185 container start 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 06 14:19:54 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [NOTICE]   (224194) : New worker (224196) forked
Oct 06 14:19:54 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [NOTICE]   (224194) : Loading success.
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.666 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.673 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.677 2 INFO nova.virt.libvirt.driver [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance spawned successfully.
Oct 06 14:19:54 compute-0 nova_compute[192903]: 2025-10-06 14:19:54.677 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.194 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.195 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.196 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.197 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.198 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.199 2 DEBUG nova.virt.libvirt.driver [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.710 2 INFO nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Took 9.99 seconds to spawn the instance on the hypervisor.
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.712 2 DEBUG nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:19:55 compute-0 nova_compute[192903]: 2025-10-06 14:19:55.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.098 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:19:56 compute-0 podman[224205]: 2025-10-06 14:19:56.234025955 +0000 UTC m=+0.094172154 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.254 2 INFO nova.compute.manager [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Took 15.25 seconds to build instance.
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.338 2 DEBUG nova.compute.manager [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.339 2 DEBUG oslo_concurrency.lockutils [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.339 2 DEBUG oslo_concurrency.lockutils [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.340 2 DEBUG oslo_concurrency.lockutils [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.340 2 DEBUG nova.compute.manager [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] No waiting events found dispatching network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.340 2 WARNING nova.compute.manager [req-a0060bb8-98fe-4769-aa0a-295e8eca0801 req-872397a3-0fb3-4449-971e-6309727de3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received unexpected event network-vif-plugged-45e2333d-26ca-45dd-947d-99a29d059183 for instance with vm_state active and task_state None.
Oct 06 14:19:56 compute-0 nova_compute[192903]: 2025-10-06 14:19:56.760 2 DEBUG oslo_concurrency.lockutils [None req-4bec7f2c-a38c-41be-82b9-2d4c218779a7 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.776s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.156 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.215 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.216 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.304 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.514 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.515 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.534 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.535 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.30122756958008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.535 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:19:57 compute-0 nova_compute[192903]: 2025-10-06 14:19:57.535 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:19:58 compute-0 nova_compute[192903]: 2025-10-06 14:19:58.588 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 0764cef7-e2fc-48c0-af26-f628def27fb4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:19:58 compute-0 nova_compute[192903]: 2025-10-06 14:19:58.589 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:19:58 compute-0 nova_compute[192903]: 2025-10-06 14:19:58.589 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:19:57 up  1:20,  0 user,  load average: 0.75, 0.43, 0.41\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:19:58 compute-0 nova_compute[192903]: 2025-10-06 14:19:58.674 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:19:59 compute-0 nova_compute[192903]: 2025-10-06 14:19:59.184 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:19:59 compute-0 nova_compute[192903]: 2025-10-06 14:19:59.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:19:59 compute-0 nova_compute[192903]: 2025-10-06 14:19:59.694 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:19:59 compute-0 nova_compute[192903]: 2025-10-06 14:19:59.695 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:19:59 compute-0 podman[203308]: time="2025-10-06T14:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:19:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:19:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Oct 06 14:20:00 compute-0 nova_compute[192903]: 2025-10-06 14:20:00.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: ERROR   14:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: ERROR   14:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: ERROR   14:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: ERROR   14:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: ERROR   14:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:20:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:20:04 compute-0 nova_compute[192903]: 2025-10-06 14:20:04.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:05 compute-0 nova_compute[192903]: 2025-10-06 14:20:05.694 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:05 compute-0 nova_compute[192903]: 2025-10-06 14:20:05.695 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:05 compute-0 nova_compute[192903]: 2025-10-06 14:20:05.695 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:05 compute-0 nova_compute[192903]: 2025-10-06 14:20:05.695 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:20:05 compute-0 nova_compute[192903]: 2025-10-06 14:20:05.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:06 compute-0 nova_compute[192903]: 2025-10-06 14:20:06.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:06 compute-0 ovn_controller[95205]: 2025-10-06T14:20:06Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:5b:96 10.100.0.7
Oct 06 14:20:06 compute-0 ovn_controller[95205]: 2025-10-06T14:20:06Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:5b:96 10.100.0.7
Oct 06 14:20:07 compute-0 nova_compute[192903]: 2025-10-06 14:20:07.777 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Creating tmpfile /var/lib/nova/instances/tmp8lulxppg to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:20:07 compute-0 nova_compute[192903]: 2025-10-06 14:20:07.779 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:07 compute-0 nova_compute[192903]: 2025-10-06 14:20:07.784 2 DEBUG nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8lulxppg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:20:09 compute-0 nova_compute[192903]: 2025-10-06 14:20:09.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:09 compute-0 nova_compute[192903]: 2025-10-06 14:20:09.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:09 compute-0 nova_compute[192903]: 2025-10-06 14:20:09.831 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:10 compute-0 nova_compute[192903]: 2025-10-06 14:20:10.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:10 compute-0 nova_compute[192903]: 2025-10-06 14:20:10.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:11 compute-0 podman[224253]: 2025-10-06 14:20:11.201222753 +0000 UTC m=+0.061638260 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 06 14:20:11 compute-0 podman[224254]: 2025-10-06 14:20:11.208796042 +0000 UTC m=+0.062224385 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:20:11 compute-0 podman[224259]: 2025-10-06 14:20:11.250126847 +0000 UTC m=+0.095049087 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:20:11 compute-0 podman[224252]: 2025-10-06 14:20:11.25063172 +0000 UTC m=+0.107530154 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 06 14:20:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:11.385 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:11.385 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:11.386 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:14 compute-0 nova_compute[192903]: 2025-10-06 14:20:14.028 2 DEBUG nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8lulxppg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='06dbc1ff-6c73-45e4-8c11-028140e14fb0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:20:14 compute-0 nova_compute[192903]: 2025-10-06 14:20:14.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:15 compute-0 nova_compute[192903]: 2025-10-06 14:20:15.045 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:20:15 compute-0 nova_compute[192903]: 2025-10-06 14:20:15.046 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:20:15 compute-0 nova_compute[192903]: 2025-10-06 14:20:15.046 2 DEBUG nova.network.neutron [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:20:15 compute-0 nova_compute[192903]: 2025-10-06 14:20:15.554 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:15 compute-0 nova_compute[192903]: 2025-10-06 14:20:15.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:16 compute-0 nova_compute[192903]: 2025-10-06 14:20:16.520 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:16 compute-0 nova_compute[192903]: 2025-10-06 14:20:16.656 2 DEBUG nova.network.neutron [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Updating instance_info_cache with network_info: [{"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.168 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.186 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8lulxppg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='06dbc1ff-6c73-45e4-8c11-028140e14fb0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.187 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Creating instance directory: /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.188 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Creating disk.info with the contents: {'/var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk': 'qcow2', '/var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.189 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.189 2 DEBUG nova.objects.instance [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 06dbc1ff-6c73-45e4-8c11-028140e14fb0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.696 2 DEBUG oslo_utils.imageutils.format_inspector [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.704 2 DEBUG oslo_utils.imageutils.format_inspector [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.706 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.796 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.797 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.798 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.799 2 DEBUG oslo_utils.imageutils.format_inspector [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.807 2 DEBUG oslo_utils.imageutils.format_inspector [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.808 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.890 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.891 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.928 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.929 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:17 compute-0 nova_compute[192903]: 2025-10-06 14:20:17.929 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.000 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.001 2 DEBUG nova.virt.disk.api [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.002 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.068 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.069 2 DEBUG nova.virt.disk.api [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.069 2 DEBUG nova.objects.instance [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 06dbc1ff-6c73-45e4-8c11-028140e14fb0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.578 2 DEBUG nova.objects.base [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<06dbc1ff-6c73-45e4-8c11-028140e14fb0> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.579 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.615 2 DEBUG oslo_concurrency.processutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0/disk.config 497664" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.616 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.617 2 DEBUG nova.virt.libvirt.vif [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1486581317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1486581317',id=20,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:19:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-mz6aiy6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:19:34Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=06dbc1ff-6c73-45e4-8c11-028140e14fb0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.618 2 DEBUG nova.network.os_vif_util [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.618 2 DEBUG nova.network.os_vif_util [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.619 2 DEBUG os_vif [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.620 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.620 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.622 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'dcf71a90-ac34-523c-946a-bb300280f0d7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.626 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap666034ac-72, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap666034ac-72, col_values=(('qos', UUID('d48b0383-2da4-4f0d-aa93-0cb13dd4e9c6')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap666034ac-72, col_values=(('external_ids', {'iface-id': '666034ac-721c-4563-80ed-acca941f3d6b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:ad:18', 'vm-uuid': '06dbc1ff-6c73-45e4-8c11-028140e14fb0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 NetworkManager[52035]: <info>  [1759760418.6295] manager: (tap666034ac-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.636 2 INFO os_vif [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72')
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.637 2 DEBUG nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.637 2 DEBUG nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8lulxppg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='06dbc1ff-6c73-45e4-8c11-028140e14fb0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:20:18 compute-0 nova_compute[192903]: 2025-10-06 14:20:18.638 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:19 compute-0 nova_compute[192903]: 2025-10-06 14:20:19.193 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:19 compute-0 nova_compute[192903]: 2025-10-06 14:20:19.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:19.507 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:20:19 compute-0 nova_compute[192903]: 2025-10-06 14:20:19.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:19.508 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:20:20 compute-0 nova_compute[192903]: 2025-10-06 14:20:20.327 2 DEBUG nova.network.neutron [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Port 666034ac-721c-4563-80ed-acca941f3d6b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:20:20 compute-0 nova_compute[192903]: 2025-10-06 14:20:20.341 2 DEBUG nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8lulxppg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='06dbc1ff-6c73-45e4-8c11-028140e14fb0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:20:23 compute-0 podman[224357]: 2025-10-06 14:20:23.195111481 +0000 UTC m=+0.058695372 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20250930)
Oct 06 14:20:23 compute-0 kernel: tap666034ac-72: entered promiscuous mode
Oct 06 14:20:23 compute-0 NetworkManager[52035]: <info>  [1759760423.2997] manager: (tap666034ac-72): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Oct 06 14:20:23 compute-0 ovn_controller[95205]: 2025-10-06T14:20:23Z|00195|binding|INFO|Claiming lport 666034ac-721c-4563-80ed-acca941f3d6b for this additional chassis.
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:23 compute-0 ovn_controller[95205]: 2025-10-06T14:20:23Z|00196|binding|INFO|666034ac-721c-4563-80ed-acca941f3d6b: Claiming fa:16:3e:16:ad:18 10.100.0.8
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.313 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:ad:18 10.100.0.8'], port_security=['fa:16:3e:16:ad:18 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '06dbc1ff-6c73-45e4-8c11-028140e14fb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=666034ac-721c-4563-80ed-acca941f3d6b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.314 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 666034ac-721c-4563-80ed-acca941f3d6b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.317 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:20:23 compute-0 ovn_controller[95205]: 2025-10-06T14:20:23Z|00197|binding|INFO|Setting lport 666034ac-721c-4563-80ed-acca941f3d6b ovn-installed in OVS
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.343 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bf10de8d-d72a-4b62-b5a5-006715b7c261]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 systemd-machined[152985]: New machine qemu-17-instance-00000014.
Oct 06 14:20:23 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000014.
Oct 06 14:20:23 compute-0 systemd-udevd[224394]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:20:23 compute-0 NetworkManager[52035]: <info>  [1759760423.3844] device (tap666034ac-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:20:23 compute-0 NetworkManager[52035]: <info>  [1759760423.3860] device (tap666034ac-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.392 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[98ed1969-fc63-4662-83b4-53345bb82abe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.395 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[55c954b9-6afd-4cc2-b8f1-b04602a5d00b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.440 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c813097e-7938-4ab9-b823-cc2bc1873a6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.464 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[fa06a847-c936-4244-b3ed-db2841a20e75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485531, 'reachable_time': 29490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224406, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.482 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb926fc-06d2-484d-aa8b-706901cf0e88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485544, 'tstamp': 485544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224407, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485549, 'tstamp': 485549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224407, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.483 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.488 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.488 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.488 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.489 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:20:23 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:23.491 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[798ac267-261f-49b1-951c-1b47e5718386]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:23 compute-0 nova_compute[192903]: 2025-10-06 14:20:23.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:24 compute-0 nova_compute[192903]: 2025-10-06 14:20:24.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:24 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:24.511 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:27 compute-0 ovn_controller[95205]: 2025-10-06T14:20:27Z|00198|binding|INFO|Claiming lport 666034ac-721c-4563-80ed-acca941f3d6b for this chassis.
Oct 06 14:20:27 compute-0 ovn_controller[95205]: 2025-10-06T14:20:27Z|00199|binding|INFO|666034ac-721c-4563-80ed-acca941f3d6b: Claiming fa:16:3e:16:ad:18 10.100.0.8
Oct 06 14:20:27 compute-0 ovn_controller[95205]: 2025-10-06T14:20:27Z|00200|binding|INFO|Setting lport 666034ac-721c-4563-80ed-acca941f3d6b up in Southbound
Oct 06 14:20:27 compute-0 podman[224428]: 2025-10-06 14:20:27.233729275 +0000 UTC m=+0.090063406 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Oct 06 14:20:28 compute-0 nova_compute[192903]: 2025-10-06 14:20:28.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:29 compute-0 nova_compute[192903]: 2025-10-06 14:20:29.307 2 INFO nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Post operation of migration started
Oct 06 14:20:29 compute-0 nova_compute[192903]: 2025-10-06 14:20:29.308 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:29 compute-0 nova_compute[192903]: 2025-10-06 14:20:29.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:29 compute-0 podman[203308]: time="2025-10-06T14:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:20:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:20:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.216 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.217 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.314 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.314 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.315 2 DEBUG nova.network.neutron [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:20:30 compute-0 nova_compute[192903]: 2025-10-06 14:20:30.823 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: ERROR   14:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: ERROR   14:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: ERROR   14:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: ERROR   14:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: ERROR   14:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:20:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:20:31 compute-0 nova_compute[192903]: 2025-10-06 14:20:31.493 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:31 compute-0 nova_compute[192903]: 2025-10-06 14:20:31.676 2 DEBUG nova.network.neutron [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Updating instance_info_cache with network_info: [{"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:32 compute-0 nova_compute[192903]: 2025-10-06 14:20:32.183 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-06dbc1ff-6c73-45e4-8c11-028140e14fb0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:20:32 compute-0 nova_compute[192903]: 2025-10-06 14:20:32.703 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:32 compute-0 nova_compute[192903]: 2025-10-06 14:20:32.704 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:32 compute-0 nova_compute[192903]: 2025-10-06 14:20:32.704 2 DEBUG oslo_concurrency.lockutils [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:32 compute-0 nova_compute[192903]: 2025-10-06 14:20:32.709 2 INFO nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:20:32 compute-0 virtqemud[192802]: Domain id=17 name='instance-00000014' uuid=06dbc1ff-6c73-45e4-8c11-028140e14fb0 is tainted: custom-monitor
Oct 06 14:20:33 compute-0 nova_compute[192903]: 2025-10-06 14:20:33.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:33 compute-0 nova_compute[192903]: 2025-10-06 14:20:33.720 2 INFO nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:20:34 compute-0 nova_compute[192903]: 2025-10-06 14:20:34.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:34 compute-0 nova_compute[192903]: 2025-10-06 14:20:34.728 2 INFO nova.virt.libvirt.driver [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:20:34 compute-0 nova_compute[192903]: 2025-10-06 14:20:34.735 2 DEBUG nova.compute.manager [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:20:35 compute-0 nova_compute[192903]: 2025-10-06 14:20:35.247 2 DEBUG nova.objects.instance [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:20:36 compute-0 nova_compute[192903]: 2025-10-06 14:20:36.267 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:36 compute-0 nova_compute[192903]: 2025-10-06 14:20:36.340 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:36 compute-0 nova_compute[192903]: 2025-10-06 14:20:36.341 2 WARNING neutronclient.v2_0.client [None req-772f01ba-2631-4b6c-b124-73e91bf92859 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.030 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.031 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.031 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.032 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.032 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.049 2 INFO nova.compute.manager [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Terminating instance
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.571 2 DEBUG nova.compute.manager [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:20:38 compute-0 kernel: tap45e2333d-26 (unregistering): left promiscuous mode
Oct 06 14:20:38 compute-0 NetworkManager[52035]: <info>  [1759760438.6010] device (tap45e2333d-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:20:38 compute-0 ovn_controller[95205]: 2025-10-06T14:20:38Z|00201|binding|INFO|Releasing lport 45e2333d-26ca-45dd-947d-99a29d059183 from this chassis (sb_readonly=0)
Oct 06 14:20:38 compute-0 ovn_controller[95205]: 2025-10-06T14:20:38Z|00202|binding|INFO|Setting lport 45e2333d-26ca-45dd-947d-99a29d059183 down in Southbound
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 ovn_controller[95205]: 2025-10-06T14:20:38Z|00203|binding|INFO|Removing iface tap45e2333d-26 ovn-installed in OVS
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.624 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:5b:96 10.100.0.7'], port_security=['fa:16:3e:b1:5b:96 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0764cef7-e2fc-48c0-af26-f628def27fb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=45e2333d-26ca-45dd-947d-99a29d059183) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.626 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 45e2333d-26ca-45dd-947d-99a29d059183 in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.629 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.651 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[97c9b72f-0d7d-447a-9c60-ccc4f0bc1dbd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct 06 14:20:38 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 14.253s CPU time.
Oct 06 14:20:38 compute-0 systemd-machined[152985]: Machine qemu-16-instance-00000015 terminated.
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.700 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bb7a94-2e15-4eff-9449-2b43a9722e13]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.705 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[d0158036-7ffd-4c46-aa2e-2f72a58a2216]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.748 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad00a91-4648-4df7-ac70-6b9cb5336a0d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.775 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0141141d-4bfd-4032-8ccc-f8435664e823]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485531, 'reachable_time': 29490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224462, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.801 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[606f6d04-f10b-47da-b2bd-10aa8ad5a22a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485544, 'tstamp': 485544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224464, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485549, 'tstamp': 485549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224464, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.803 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.811 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.812 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.812 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.812 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:20:38 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:38.814 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c10d04-37fb-4c2f-8917-79fbfa6fabb5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.858 2 INFO nova.virt.libvirt.driver [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Instance destroyed successfully.
Oct 06 14:20:38 compute-0 nova_compute[192903]: 2025-10-06 14:20:38.861 2 DEBUG nova.objects.instance [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid 0764cef7-e2fc-48c0-af26-f628def27fb4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.323 2 DEBUG nova.compute.manager [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.324 2 DEBUG oslo_concurrency.lockutils [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.324 2 DEBUG oslo_concurrency.lockutils [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.325 2 DEBUG oslo_concurrency.lockutils [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.325 2 DEBUG nova.compute.manager [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] No waiting events found dispatching network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.326 2 DEBUG nova.compute.manager [req-823a300f-df31-4ac8-9aae-12747abcb36b req-cffaaebf-61ee-4a2f-9a2d-6c6aad64e5a8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.367 2 DEBUG nova.virt.libvirt.vif [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-512152231',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-512152231',id=21,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:19:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-d53597g0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:19:55Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=0764cef7-e2fc-48c0-af26-f628def27fb4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.368 2 DEBUG nova.network.os_vif_util [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "45e2333d-26ca-45dd-947d-99a29d059183", "address": "fa:16:3e:b1:5b:96", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45e2333d-26", "ovs_interfaceid": "45e2333d-26ca-45dd-947d-99a29d059183", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.369 2 DEBUG nova.network.os_vif_util [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.370 2 DEBUG os_vif [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.372 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45e2333d-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.379 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e7e205a1-8d7b-4eb8-bd8a-6adc84fb38d1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.385 2 INFO os_vif [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:5b:96,bridge_name='br-int',has_traffic_filtering=True,id=45e2333d-26ca-45dd-947d-99a29d059183,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45e2333d-26')
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.386 2 INFO nova.virt.libvirt.driver [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Deleting instance files /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4_del
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.387 2 INFO nova.virt.libvirt.driver [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Deletion of /var/lib/nova/instances/0764cef7-e2fc-48c0-af26-f628def27fb4_del complete
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.901 2 INFO nova.compute.manager [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.902 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.902 2 DEBUG nova.compute.manager [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.903 2 DEBUG nova.network.neutron [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:20:39 compute-0 nova_compute[192903]: 2025-10-06 14:20:39.903 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:40 compute-0 nova_compute[192903]: 2025-10-06 14:20:40.192 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:40 compute-0 nova_compute[192903]: 2025-10-06 14:20:40.513 2 DEBUG nova.compute.manager [req-c3c72754-3b52-4367-bf10-c7023c1d9c3a req-4087326a-e922-46d1-af1d-693919d8b92a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-deleted-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:40 compute-0 nova_compute[192903]: 2025-10-06 14:20:40.514 2 INFO nova.compute.manager [req-c3c72754-3b52-4367-bf10-c7023c1d9c3a req-4087326a-e922-46d1-af1d-693919d8b92a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Neutron deleted interface 45e2333d-26ca-45dd-947d-99a29d059183; detaching it from the instance and deleting it from the info cache
Oct 06 14:20:40 compute-0 nova_compute[192903]: 2025-10-06 14:20:40.514 2 DEBUG nova.network.neutron [req-c3c72754-3b52-4367-bf10-c7023c1d9c3a req-4087326a-e922-46d1-af1d-693919d8b92a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:40 compute-0 nova_compute[192903]: 2025-10-06 14:20:40.941 2 DEBUG nova.network.neutron [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.021 2 DEBUG nova.compute.manager [req-c3c72754-3b52-4367-bf10-c7023c1d9c3a req-4087326a-e922-46d1-af1d-693919d8b92a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Detach interface failed, port_id=45e2333d-26ca-45dd-947d-99a29d059183, reason: Instance 0764cef7-e2fc-48c0-af26-f628def27fb4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.379 2 DEBUG nova.compute.manager [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.380 2 DEBUG oslo_concurrency.lockutils [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.381 2 DEBUG oslo_concurrency.lockutils [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.381 2 DEBUG oslo_concurrency.lockutils [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.381 2 DEBUG nova.compute.manager [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] No waiting events found dispatching network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.382 2 DEBUG nova.compute.manager [req-b635f180-40a8-4571-b76f-4fc50db45409 req-8aa67673-9cf0-4bc4-a1c7-0de8417b5ed9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Received event network-vif-unplugged-45e2333d-26ca-45dd-947d-99a29d059183 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.449 2 INFO nova.compute.manager [-] [instance: 0764cef7-e2fc-48c0-af26-f628def27fb4] Took 1.55 seconds to deallocate network for instance.
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.971 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:41 compute-0 nova_compute[192903]: 2025-10-06 14:20:41.972 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:42 compute-0 nova_compute[192903]: 2025-10-06 14:20:42.036 2 DEBUG nova.compute.provider_tree [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:20:42 compute-0 podman[224485]: 2025-10-06 14:20:42.249263863 +0000 UTC m=+0.082922828 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:20:42 compute-0 podman[224484]: 2025-10-06 14:20:42.259160143 +0000 UTC m=+0.095516119 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:20:42 compute-0 podman[224483]: 2025-10-06 14:20:42.310493821 +0000 UTC m=+0.153349228 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:20:42 compute-0 podman[224482]: 2025-10-06 14:20:42.3249188 +0000 UTC m=+0.170288063 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 14:20:42 compute-0 nova_compute[192903]: 2025-10-06 14:20:42.544 2 DEBUG nova.scheduler.client.report [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:20:43 compute-0 nova_compute[192903]: 2025-10-06 14:20:43.057 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:43 compute-0 nova_compute[192903]: 2025-10-06 14:20:43.077 2 INFO nova.scheduler.client.report [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance 0764cef7-e2fc-48c0-af26-f628def27fb4
Oct 06 14:20:44 compute-0 nova_compute[192903]: 2025-10-06 14:20:44.111 2 DEBUG oslo_concurrency.lockutils [None req-d1a8c699-0a9b-4294-b7a7-b5718944669b 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "0764cef7-e2fc-48c0-af26-f628def27fb4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:44 compute-0 nova_compute[192903]: 2025-10-06 14:20:44.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:44 compute-0 nova_compute[192903]: 2025-10-06 14:20:44.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.013 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.014 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.014 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.014 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.014 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.026 2 INFO nova.compute.manager [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Terminating instance
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.542 2 DEBUG nova.compute.manager [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:20:46 compute-0 kernel: tap666034ac-72 (unregistering): left promiscuous mode
Oct 06 14:20:46 compute-0 NetworkManager[52035]: <info>  [1759760446.5761] device (tap666034ac-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:20:46 compute-0 ovn_controller[95205]: 2025-10-06T14:20:46Z|00204|binding|INFO|Releasing lport 666034ac-721c-4563-80ed-acca941f3d6b from this chassis (sb_readonly=0)
Oct 06 14:20:46 compute-0 ovn_controller[95205]: 2025-10-06T14:20:46Z|00205|binding|INFO|Setting lport 666034ac-721c-4563-80ed-acca941f3d6b down in Southbound
Oct 06 14:20:46 compute-0 ovn_controller[95205]: 2025-10-06T14:20:46Z|00206|binding|INFO|Removing iface tap666034ac-72 ovn-installed in OVS
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.599 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:ad:18 10.100.0.8'], port_security=['fa:16:3e:16:ad:18 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '06dbc1ff-6c73-45e4-8c11-028140e14fb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=666034ac-721c-4563-80ed-acca941f3d6b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.601 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 666034ac-721c-4563-80ed-acca941f3d6b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.603 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.604 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[00dd1a9a-9f75-467d-837c-79f7ff28d525]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.605 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct 06 14:20:46 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000014.scope: Consumed 2.790s CPU time.
Oct 06 14:20:46 compute-0 systemd-machined[152985]: Machine qemu-17-instance-00000014 terminated.
Oct 06 14:20:46 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [NOTICE]   (224194) : haproxy version is 3.0.5-8e879a5
Oct 06 14:20:46 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [NOTICE]   (224194) : path to executable is /usr/sbin/haproxy
Oct 06 14:20:46 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [WARNING]  (224194) : Exiting Master process...
Oct 06 14:20:46 compute-0 podman[224592]: 2025-10-06 14:20:46.797277964 +0000 UTC m=+0.042876457 container kill 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:20:46 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [ALERT]    (224194) : Current worker (224196) exited with code 143 (Terminated)
Oct 06 14:20:46 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224190]: [WARNING]  (224194) : All workers exited. Exiting... (0)
Oct 06 14:20:46 compute-0 systemd[1]: libpod-2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2.scope: Deactivated successfully.
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.817 2 INFO nova.virt.libvirt.driver [-] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Instance destroyed successfully.
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.817 2 DEBUG nova.objects.instance [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid 06dbc1ff-6c73-45e4-8c11-028140e14fb0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:20:46 compute-0 podman[224626]: 2025-10-06 14:20:46.858868541 +0000 UTC m=+0.026986609 container died 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 06 14:20:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2-userdata-shm.mount: Deactivated successfully.
Oct 06 14:20:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-4fd094fff50763222635f4ddbcdf442ac2b516996844745332f2b7bf7ad1f647-merged.mount: Deactivated successfully.
Oct 06 14:20:46 compute-0 podman[224626]: 2025-10-06 14:20:46.91517259 +0000 UTC m=+0.083290558 container remove 2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.923 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[752a93b3-b64b-47ec-879a-ad1c3ad1cbe6]: (4, ("Mon Oct  6 02:20:46 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2)\n2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2\nMon Oct  6 02:20:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2)\n2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:46 compute-0 systemd[1]: libpod-conmon-2fc82151058ba9e95822d27ae7ccf8c1386600aa262dc9c9ba67efbf3f3d4aa2.scope: Deactivated successfully.
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.925 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8f942f-c10a-4a7e-8e24-a9288d924a00]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.925 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.926 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[13c58c3c-4e3d-4094-b3be-0220e1d3d3d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.926 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.952 2 DEBUG nova.compute.manager [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Received event network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.953 2 DEBUG oslo_concurrency.lockutils [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.953 2 DEBUG oslo_concurrency.lockutils [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.953 2 DEBUG oslo_concurrency.lockutils [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.954 2 DEBUG nova.compute.manager [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] No waiting events found dispatching network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.954 2 DEBUG nova.compute.manager [req-58033175-f9d2-4341-831d-c326fa90e6d1 req-59a5befd-f20e-420a-8509-bbf75c67b16a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Received event network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 nova_compute[192903]: 2025-10-06 14:20:46.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:46.960 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc1005c-1fc6-4d71-b7a8-a063631866d1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:47.005 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[56a47ab2-8ee7-4664-a580-55b5152ec2dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:47.006 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9731c448-7ea9-4c97-b8f9-a36a5c1470ea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:47.028 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d3d6d1-5c20-4394-a204-7dc5cd90ba0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485526, 'reachable_time': 25212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224657, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:47.032 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:20:47 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:20:47.032 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d65f54-d488-4807-8a0f-c74468fcf134]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:20:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.332 2 DEBUG nova.virt.libvirt.vif [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:19:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1486581317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1486581317',id=20,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:19:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-mz6aiy6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:20:35Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=06dbc1ff-6c73-45e4-8c11-028140e14fb0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.332 2 DEBUG nova.network.os_vif_util [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "666034ac-721c-4563-80ed-acca941f3d6b", "address": "fa:16:3e:16:ad:18", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap666034ac-72", "ovs_interfaceid": "666034ac-721c-4563-80ed-acca941f3d6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.333 2 DEBUG nova.network.os_vif_util [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.333 2 DEBUG os_vif [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.334 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap666034ac-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d48b0383-2da4-4f0d-aa93-0cb13dd4e9c6) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.340 2 INFO os_vif [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:ad:18,bridge_name='br-int',has_traffic_filtering=True,id=666034ac-721c-4563-80ed-acca941f3d6b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap666034ac-72')
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.341 2 INFO nova.virt.libvirt.driver [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Deleting instance files /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0_del
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.341 2 INFO nova.virt.libvirt.driver [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Deletion of /var/lib/nova/instances/06dbc1ff-6c73-45e4-8c11-028140e14fb0_del complete
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.854 2 INFO nova.compute.manager [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.854 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.855 2 DEBUG nova.compute.manager [-] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.855 2 DEBUG nova.network.neutron [-] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:20:47 compute-0 nova_compute[192903]: 2025-10-06 14:20:47.855 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:48 compute-0 nova_compute[192903]: 2025-10-06 14:20:48.189 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:20:48 compute-0 nova_compute[192903]: 2025-10-06 14:20:48.552 2 DEBUG nova.compute.manager [req-4cc2d99b-47a9-4eeb-8739-29df5532beac req-1e686643-5bc0-48ed-8c17-4c4eb6605016 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Received event network-vif-deleted-666034ac-721c-4563-80ed-acca941f3d6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:48 compute-0 nova_compute[192903]: 2025-10-06 14:20:48.553 2 INFO nova.compute.manager [req-4cc2d99b-47a9-4eeb-8739-29df5532beac req-1e686643-5bc0-48ed-8c17-4c4eb6605016 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Neutron deleted interface 666034ac-721c-4563-80ed-acca941f3d6b; detaching it from the instance and deleting it from the info cache
Oct 06 14:20:48 compute-0 nova_compute[192903]: 2025-10-06 14:20:48.553 2 DEBUG nova.network.neutron [req-4cc2d99b-47a9-4eeb-8739-29df5532beac req-1e686643-5bc0-48ed-8c17-4c4eb6605016 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:48 compute-0 nova_compute[192903]: 2025-10-06 14:20:48.978 2 DEBUG nova.network.neutron [-] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.005 2 DEBUG nova.compute.manager [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Received event network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.005 2 DEBUG oslo_concurrency.lockutils [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.006 2 DEBUG oslo_concurrency.lockutils [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.006 2 DEBUG oslo_concurrency.lockutils [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.007 2 DEBUG nova.compute.manager [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] No waiting events found dispatching network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.007 2 DEBUG nova.compute.manager [req-e0d30002-306a-41cc-9d7f-f7ff1a5d2fed req-417a61fc-72cf-4c13-9ee0-ec35dd0f32d4 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Received event network-vif-unplugged-666034ac-721c-4563-80ed-acca941f3d6b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.061 2 DEBUG nova.compute.manager [req-4cc2d99b-47a9-4eeb-8739-29df5532beac req-1e686643-5bc0-48ed-8c17-4c4eb6605016 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Detach interface failed, port_id=666034ac-721c-4563-80ed-acca941f3d6b, reason: Instance 06dbc1ff-6c73-45e4-8c11-028140e14fb0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:49 compute-0 nova_compute[192903]: 2025-10-06 14:20:49.492 2 INFO nova.compute.manager [-] [instance: 06dbc1ff-6c73-45e4-8c11-028140e14fb0] Took 1.64 seconds to deallocate network for instance.
Oct 06 14:20:50 compute-0 nova_compute[192903]: 2025-10-06 14:20:50.015 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:50 compute-0 nova_compute[192903]: 2025-10-06 14:20:50.016 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:50 compute-0 nova_compute[192903]: 2025-10-06 14:20:50.023 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:50 compute-0 nova_compute[192903]: 2025-10-06 14:20:50.055 2 INFO nova.scheduler.client.report [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance 06dbc1ff-6c73-45e4-8c11-028140e14fb0
Oct 06 14:20:51 compute-0 nova_compute[192903]: 2025-10-06 14:20:51.083 2 DEBUG oslo_concurrency.lockutils [None req-247f982b-39fd-48ae-a17a-fb79227dfb30 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "06dbc1ff-6c73-45e4-8c11-028140e14fb0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:52 compute-0 nova_compute[192903]: 2025-10-06 14:20:52.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:54 compute-0 podman[224659]: 2025-10-06 14:20:54.266635498 +0000 UTC m=+0.120396222 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:20:54 compute-0 nova_compute[192903]: 2025-10-06 14:20:54.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:55 compute-0 nova_compute[192903]: 2025-10-06 14:20:55.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:57 compute-0 nova_compute[192903]: 2025-10-06 14:20:57.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:57 compute-0 nova_compute[192903]: 2025-10-06 14:20:57.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:20:58 compute-0 podman[224679]: 2025-10-06 14:20:58.236389345 +0000 UTC m=+0.097089351 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git)
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.307 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.308 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.328 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.329 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5857MB free_disk=73.30206298828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.330 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:20:58 compute-0 nova_compute[192903]: 2025-10-06 14:20:58.330 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:20:59 compute-0 nova_compute[192903]: 2025-10-06 14:20:59.368 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:20:59 compute-0 nova_compute[192903]: 2025-10-06 14:20:59.369 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:20:58 up  1:22,  0 user,  load average: 0.39, 0.40, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:20:59 compute-0 nova_compute[192903]: 2025-10-06 14:20:59.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:20:59 compute-0 nova_compute[192903]: 2025-10-06 14:20:59.446 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:20:59 compute-0 podman[203308]: time="2025-10-06T14:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:20:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:20:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 06 14:20:59 compute-0 nova_compute[192903]: 2025-10-06 14:20:59.960 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:21:00 compute-0 nova_compute[192903]: 2025-10-06 14:21:00.479 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:21:00 compute-0 nova_compute[192903]: 2025-10-06 14:21:00.480 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.150s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: ERROR   14:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: ERROR   14:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: ERROR   14:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: ERROR   14:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: ERROR   14:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:21:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:21:02 compute-0 nova_compute[192903]: 2025-10-06 14:21:02.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:04 compute-0 nova_compute[192903]: 2025-10-06 14:21:04.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:05 compute-0 nova_compute[192903]: 2025-10-06 14:21:05.476 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:05 compute-0 nova_compute[192903]: 2025-10-06 14:21:05.477 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:05 compute-0 nova_compute[192903]: 2025-10-06 14:21:05.987 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:05 compute-0 nova_compute[192903]: 2025-10-06 14:21:05.988 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:05 compute-0 nova_compute[192903]: 2025-10-06 14:21:05.988 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:21:07 compute-0 nova_compute[192903]: 2025-10-06 14:21:07.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:07 compute-0 nova_compute[192903]: 2025-10-06 14:21:07.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:09 compute-0 nova_compute[192903]: 2025-10-06 14:21:09.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:10 compute-0 nova_compute[192903]: 2025-10-06 14:21:10.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:10 compute-0 nova_compute[192903]: 2025-10-06 14:21:10.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:11.387 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:11.387 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:11.387 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:12 compute-0 nova_compute[192903]: 2025-10-06 14:21:12.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:13 compute-0 podman[224706]: 2025-10-06 14:21:13.255295732 +0000 UTC m=+0.104270728 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:21:13 compute-0 podman[224713]: 2025-10-06 14:21:13.255792095 +0000 UTC m=+0.094019389 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:21:13 compute-0 podman[224712]: 2025-10-06 14:21:13.257115559 +0000 UTC m=+0.101468284 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 06 14:21:13 compute-0 podman[224705]: 2025-10-06 14:21:13.258214198 +0000 UTC m=+0.124702804 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:21:14 compute-0 nova_compute[192903]: 2025-10-06 14:21:14.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:15 compute-0 nova_compute[192903]: 2025-10-06 14:21:15.322 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:15 compute-0 nova_compute[192903]: 2025-10-06 14:21:15.322 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:15 compute-0 nova_compute[192903]: 2025-10-06 14:21:15.827 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:21:16 compute-0 nova_compute[192903]: 2025-10-06 14:21:16.391 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:16 compute-0 nova_compute[192903]: 2025-10-06 14:21:16.392 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:16 compute-0 nova_compute[192903]: 2025-10-06 14:21:16.397 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:21:16 compute-0 nova_compute[192903]: 2025-10-06 14:21:16.398 2 INFO nova.compute.claims [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:21:17 compute-0 nova_compute[192903]: 2025-10-06 14:21:17.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:17 compute-0 nova_compute[192903]: 2025-10-06 14:21:17.454 2 DEBUG nova.compute.provider_tree [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:21:17 compute-0 nova_compute[192903]: 2025-10-06 14:21:17.964 2 DEBUG nova.scheduler.client.report [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.477 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.477 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.989 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.990 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.991 2 WARNING neutronclient.v2_0.client [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:21:18 compute-0 nova_compute[192903]: 2025-10-06 14:21:18.992 2 WARNING neutronclient.v2_0.client [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:21:19 compute-0 nova_compute[192903]: 2025-10-06 14:21:19.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:19 compute-0 nova_compute[192903]: 2025-10-06 14:21:19.504 2 INFO nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:21:19 compute-0 nova_compute[192903]: 2025-10-06 14:21:19.753 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Successfully created port: 16d53745-b5c8-468d-889b-b34847d3dd0e _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.013 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.405 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Successfully updated port: 16d53745-b5c8-468d-889b-b34847d3dd0e _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.455 2 DEBUG nova.compute.manager [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-changed-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.456 2 DEBUG nova.compute.manager [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Refreshing instance network info cache due to event network-changed-16d53745-b5c8-468d-889b-b34847d3dd0e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.456 2 DEBUG oslo_concurrency.lockutils [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.456 2 DEBUG oslo_concurrency.lockutils [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.457 2 DEBUG nova.network.neutron [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Refreshing network info cache for port 16d53745-b5c8-468d-889b-b34847d3dd0e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.910 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:21:20 compute-0 nova_compute[192903]: 2025-10-06 14:21:20.961 2 WARNING neutronclient.v2_0.client [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.032 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.034 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.035 2 INFO nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Creating image(s)
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.036 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.036 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.038 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.039 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.046 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.048 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.064 2 DEBUG nova.network.neutron [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.140 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.141 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.142 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.143 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.150 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.151 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.230 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.231 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.273 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.275 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.276 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.295 2 DEBUG nova.network.neutron [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.366 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.367 2 DEBUG nova.virt.disk.api [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Checking if we can resize image /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.368 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.437 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.439 2 DEBUG nova.virt.disk.api [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Cannot resize image /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.440 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.440 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Ensure instance console log exists: /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.441 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.441 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.442 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.807 2 DEBUG oslo_concurrency.lockutils [req-b99b5422-9aa6-4577-8010-85df1ec8d1bd req-c4ecebd5-8621-4cf1-8cd9-aacd97449bc6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.808 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquired lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:21:21 compute-0 nova_compute[192903]: 2025-10-06 14:21:21.808 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:21:22 compute-0 nova_compute[192903]: 2025-10-06 14:21:22.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:23 compute-0 nova_compute[192903]: 2025-10-06 14:21:23.217 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.206 2 WARNING neutronclient.v2_0.client [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.350 2 DEBUG nova.network.neutron [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Updating instance_info_cache with network_info: [{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.859 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Releasing lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.860 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance network_info: |[{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.861 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Start _get_guest_xml network_info=[{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.866 2 WARNING nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.867 2 DEBUG nova.virt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-489099389', uuid='f29ba44d-139b-42c0-8270-fb9071f47ce0'), owner=OwnerMeta(userid='98ee6da236ba42baa0fef11dcb52cbdd', username='tempest-TestExecuteStrategies-1255317741-project-admin', projectid='8f3f3b7d20fc4715811486da569fc0ab', projectname='tempest-TestExecuteStrategies-1255317741'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759760484.8672452) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.872 2 DEBUG nova.virt.libvirt.host [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.873 2 DEBUG nova.virt.libvirt.host [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.876 2 DEBUG nova.virt.libvirt.host [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.876 2 DEBUG nova.virt.libvirt.host [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.877 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.877 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.877 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.877 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.878 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.879 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.879 2 DEBUG nova.virt.hardware [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.883 2 DEBUG nova.virt.libvirt.vif [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-489099389',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-489099389',id=23,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-bi03b5yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:21:20Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f29ba44d-139b-42c0-8270-fb9071f47ce0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.883 2 DEBUG nova.network.os_vif_util [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.883 2 DEBUG nova.network.os_vif_util [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:21:24 compute-0 nova_compute[192903]: 2025-10-06 14:21:24.884 2 DEBUG nova.objects.instance [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'pci_devices' on Instance uuid f29ba44d-139b-42c0-8270-fb9071f47ce0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:21:25 compute-0 podman[224804]: 2025-10-06 14:21:25.236666771 +0000 UTC m=+0.094451781 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid)
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.393 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <uuid>f29ba44d-139b-42c0-8270-fb9071f47ce0</uuid>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <name>instance-00000017</name>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-489099389</nova:name>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:21:24</nova:creationTime>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:21:25 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:21:25 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         <nova:port uuid="16d53745-b5c8-468d-889b-b34847d3dd0e">
Oct 06 14:21:25 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <system>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="serial">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="uuid">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </system>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <os>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </os>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <features>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </features>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:00:d4:27"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <target dev="tap16d53745-b5"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <video>
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </video>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:21:25 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:21:25 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:21:25 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:21:25 compute-0 nova_compute[192903]: </domain>
Oct 06 14:21:25 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.393 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Preparing to wait for external event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.394 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.394 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.394 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.395 2 DEBUG nova.virt.libvirt.vif [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-489099389',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-489099389',id=23,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-bi03b5yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-proj
ect-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:21:20Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f29ba44d-139b-42c0-8270-fb9071f47ce0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.395 2 DEBUG nova.network.os_vif_util [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.395 2 DEBUG nova.network.os_vif_util [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.395 2 DEBUG os_vif [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.396 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'aa9ebcff-3ff8-5601-9725-e0303961f948', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.401 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d53745-b5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.402 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap16d53745-b5, col_values=(('qos', UUID('9527fa08-5101-4b90-b777-b9b2dbe0059d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.402 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap16d53745-b5, col_values=(('external_ids', {'iface-id': '16d53745-b5c8-468d-889b-b34847d3dd0e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:d4:27', 'vm-uuid': 'f29ba44d-139b-42c0-8270-fb9071f47ce0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 NetworkManager[52035]: <info>  [1759760485.4038] manager: (tap16d53745-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:25 compute-0 nova_compute[192903]: 2025-10-06 14:21:25.409 2 INFO os_vif [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5')
Oct 06 14:21:26 compute-0 nova_compute[192903]: 2025-10-06 14:21:26.980 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:21:26 compute-0 nova_compute[192903]: 2025-10-06 14:21:26.981 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:21:26 compute-0 nova_compute[192903]: 2025-10-06 14:21:26.981 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] No VIF found with MAC fa:16:3e:00:d4:27, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:21:26 compute-0 nova_compute[192903]: 2025-10-06 14:21:26.982 2 INFO nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Using config drive
Oct 06 14:21:27 compute-0 nova_compute[192903]: 2025-10-06 14:21:27.495 2 WARNING neutronclient.v2_0.client [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.292 2 INFO nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Creating config drive at /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.297 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp1k1_woub execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.421 2 DEBUG oslo_concurrency.processutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmp1k1_woub" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:21:28 compute-0 kernel: tap16d53745-b5: entered promiscuous mode
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.5274] manager: (tap16d53745-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Oct 06 14:21:28 compute-0 ovn_controller[95205]: 2025-10-06T14:21:28Z|00207|binding|INFO|Claiming lport 16d53745-b5c8-468d-889b-b34847d3dd0e for this chassis.
Oct 06 14:21:28 compute-0 ovn_controller[95205]: 2025-10-06T14:21:28Z|00208|binding|INFO|16d53745-b5c8-468d-889b-b34847d3dd0e: Claiming fa:16:3e:00:d4:27 10.100.0.8
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.538 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:d4:27 10.100.0.8'], port_security=['fa:16:3e:00:d4:27 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f29ba44d-139b-42c0-8270-fb9071f47ce0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=16d53745-b5c8-468d-889b-b34847d3dd0e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.541 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 16d53745-b5c8-468d-889b-b34847d3dd0e in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e bound to our chassis
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.542 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:21:28 compute-0 ovn_controller[95205]: 2025-10-06T14:21:28Z|00209|binding|INFO|Setting lport 16d53745-b5c8-468d-889b-b34847d3dd0e ovn-installed in OVS
Oct 06 14:21:28 compute-0 ovn_controller[95205]: 2025-10-06T14:21:28Z|00210|binding|INFO|Setting lport 16d53745-b5c8-468d-889b-b34847d3dd0e up in Southbound
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.563 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[92505164-b658-4ce9-9ad4-342adc7d396f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.564 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.566 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.566 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e82aa547-4bd7-45ba-974b-67e785841960]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.568 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[30ab7c94-f552-4cb2-8413-c69a9ded1e29]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 systemd-udevd[224849]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.587 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4c2443-49f2-4a81-a10c-cfb5113e7d12]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.5901] device (tap16d53745-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.5916] device (tap16d53745-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:21:28 compute-0 systemd-machined[152985]: New machine qemu-18-instance-00000017.
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.599 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2f6e66-045c-424f-8218-ee99c91eac77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000017.
Oct 06 14:21:28 compute-0 podman[224836]: 2025-10-06 14:21:28.635121235 +0000 UTC m=+0.113738298 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350)
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.639 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f88fb575-7829-464d-8c53-394ae8995a2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.646 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[86040012-57a0-4c21-bcfa-eebb25ef0302]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 systemd-udevd[224861]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.6472] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.693 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[d9914afe-d39a-4cc1-a9a5-d697eaa40a39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.696 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[21fc947e-8a7f-4ffa-a456-c16f2ea335a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.7214] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.732 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[fd38ce2f-1f04-49c9-a14d-efdbce9242c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.762 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c48965d5-f206-4b6c-86e3-4442924ae5cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495036, 'reachable_time': 44525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224895, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.784 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a26249cb-c394-4ac8-a1f2-5ff683b49713]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495036, 'tstamp': 495036}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224896, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.801 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7014ff3c-ea8d-49f7-b0dd-b9b7bfc95d13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495036, 'reachable_time': 44525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224897, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.845 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c98eb8-8a33-479a-8902-4ea1cfe1a11c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.940 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[aee6ed1f-0d3b-476d-8bf6-5f63ec2bf784]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.942 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.942 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.943 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:28 compute-0 NetworkManager[52035]: <info>  [1759760488.9466] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct 06 14:21:28 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.955 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 ovn_controller[95205]: 2025-10-06T14:21:28Z|00211|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.963 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[16972299-3ed0-43e0-9f38-2f94d5d9846e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.964 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.964 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.964 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.965 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.968 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[76826094-2c69-45fa-850a-5215681c7734]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.969 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.969 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4008dc2b-0e2c-431e-bda0-1f0e7be927b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.970 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:21:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:21:28.971 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:21:28 compute-0 nova_compute[192903]: 2025-10-06 14:21:28.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.396 2 DEBUG nova.compute.manager [req-18c1839c-9292-4b99-9d2e-a2f688889d84 req-355468ff-273c-49f5-800d-3e2c5496b50f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.397 2 DEBUG oslo_concurrency.lockutils [req-18c1839c-9292-4b99-9d2e-a2f688889d84 req-355468ff-273c-49f5-800d-3e2c5496b50f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.398 2 DEBUG oslo_concurrency.lockutils [req-18c1839c-9292-4b99-9d2e-a2f688889d84 req-355468ff-273c-49f5-800d-3e2c5496b50f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.398 2 DEBUG oslo_concurrency.lockutils [req-18c1839c-9292-4b99-9d2e-a2f688889d84 req-355468ff-273c-49f5-800d-3e2c5496b50f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.399 2 DEBUG nova.compute.manager [req-18c1839c-9292-4b99-9d2e-a2f688889d84 req-355468ff-273c-49f5-800d-3e2c5496b50f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Processing event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.427 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:21:29 compute-0 podman[224936]: 2025-10-06 14:21:29.43159887 +0000 UTC m=+0.066289661 container create 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.434 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.437 2 INFO nova.virt.libvirt.driver [-] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance spawned successfully.
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.439 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:29 compute-0 systemd[1]: Started libpod-conmon-99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70.scope.
Oct 06 14:21:29 compute-0 podman[224936]: 2025-10-06 14:21:29.394105366 +0000 UTC m=+0.028796137 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:21:29 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:21:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6bfd7a9b7bd3bf5a393ed8fbed24552dd4cea724ef7bb3ed28d52958bf71d0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:21:29 compute-0 podman[224936]: 2025-10-06 14:21:29.520501315 +0000 UTC m=+0.155192096 container init 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 06 14:21:29 compute-0 podman[224936]: 2025-10-06 14:21:29.527254042 +0000 UTC m=+0.161944793 container start 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:21:29 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [NOTICE]   (224955) : New worker (224957) forked
Oct 06 14:21:29 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [NOTICE]   (224955) : Loading success.
Oct 06 14:21:29 compute-0 podman[203308]: time="2025-10-06T14:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:21:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:21:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.951 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.952 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.952 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.952 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.953 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:29 compute-0 nova_compute[192903]: 2025-10-06 14:21:29.953 2 DEBUG nova.virt.libvirt.driver [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:21:30 compute-0 nova_compute[192903]: 2025-10-06 14:21:30.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:30 compute-0 nova_compute[192903]: 2025-10-06 14:21:30.463 2 INFO nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Took 9.43 seconds to spawn the instance on the hypervisor.
Oct 06 14:21:30 compute-0 nova_compute[192903]: 2025-10-06 14:21:30.463 2 DEBUG nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.009 2 INFO nova.compute.manager [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Took 14.67 seconds to build instance.
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: ERROR   14:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: ERROR   14:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: ERROR   14:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: ERROR   14:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: ERROR   14:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:21:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.444 2 DEBUG nova.compute.manager [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.444 2 DEBUG oslo_concurrency.lockutils [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.445 2 DEBUG oslo_concurrency.lockutils [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.445 2 DEBUG oslo_concurrency.lockutils [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.445 2 DEBUG nova.compute.manager [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.446 2 WARNING nova.compute.manager [req-38ec475a-0668-41b2-93ba-199e1a94dc95 req-bff48783-3394-4a9b-bd79-e5ee964a89b6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received unexpected event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with vm_state active and task_state None.
Oct 06 14:21:31 compute-0 nova_compute[192903]: 2025-10-06 14:21:31.516 2 DEBUG oslo_concurrency.lockutils [None req-5d744447-00d0-4995-b28d-c7d26a5df294 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.194s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:21:34 compute-0 nova_compute[192903]: 2025-10-06 14:21:34.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:35 compute-0 nova_compute[192903]: 2025-10-06 14:21:35.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:39 compute-0 nova_compute[192903]: 2025-10-06 14:21:39.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:40 compute-0 nova_compute[192903]: 2025-10-06 14:21:40.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:41 compute-0 ovn_controller[95205]: 2025-10-06T14:21:41Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:d4:27 10.100.0.8
Oct 06 14:21:41 compute-0 ovn_controller[95205]: 2025-10-06T14:21:41Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:d4:27 10.100.0.8
Oct 06 14:21:44 compute-0 podman[224981]: 2025-10-06 14:21:44.238686496 +0000 UTC m=+0.075913275 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:21:44 compute-0 podman[224982]: 2025-10-06 14:21:44.264971136 +0000 UTC m=+0.096095585 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:21:44 compute-0 podman[224980]: 2025-10-06 14:21:44.265015147 +0000 UTC m=+0.111585141 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 14:21:44 compute-0 podman[224979]: 2025-10-06 14:21:44.281802028 +0000 UTC m=+0.131676979 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 06 14:21:44 compute-0 nova_compute[192903]: 2025-10-06 14:21:44.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:45 compute-0 nova_compute[192903]: 2025-10-06 14:21:45.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:49 compute-0 nova_compute[192903]: 2025-10-06 14:21:49.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:50 compute-0 nova_compute[192903]: 2025-10-06 14:21:50.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:54 compute-0 nova_compute[192903]: 2025-10-06 14:21:54.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:54 compute-0 nova_compute[192903]: 2025-10-06 14:21:54.888 2 DEBUG nova.compute.manager [None req-6cb19039-277e-49fa-a58a-f66f92b0fe12 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 06 14:21:54 compute-0 nova_compute[192903]: 2025-10-06 14:21:54.936 2 DEBUG nova.compute.provider_tree [None req-6cb19039-277e-49fa-a58a-f66f92b0fe12 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 20 to 26 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 14:21:55 compute-0 nova_compute[192903]: 2025-10-06 14:21:55.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:56 compute-0 podman[225064]: 2025-10-06 14:21:56.229318868 +0000 UTC m=+0.082842936 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct 06 14:21:57 compute-0 nova_compute[192903]: 2025-10-06 14:21:57.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:58 compute-0 ovn_controller[95205]: 2025-10-06T14:21:58Z|00212|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Oct 06 14:21:59 compute-0 podman[225085]: 2025-10-06 14:21:59.20230869 +0000 UTC m=+0.073320177 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Oct 06 14:21:59 compute-0 nova_compute[192903]: 2025-10-06 14:21:59.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:21:59 compute-0 nova_compute[192903]: 2025-10-06 14:21:59.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:21:59 compute-0 podman[203308]: time="2025-10-06T14:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:21:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:21:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Oct 06 14:22:00 compute-0 nova_compute[192903]: 2025-10-06 14:22:00.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:00 compute-0 nova_compute[192903]: 2025-10-06 14:22:00.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:00 compute-0 nova_compute[192903]: 2025-10-06 14:22:00.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:00 compute-0 nova_compute[192903]: 2025-10-06 14:22:00.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:22:00 compute-0 nova_compute[192903]: 2025-10-06 14:22:00.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.230 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.307 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.308 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.372 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: ERROR   14:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: ERROR   14:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: ERROR   14:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: ERROR   14:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: ERROR   14:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:22:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.541 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.543 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.566 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.567 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.27301788330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.568 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:01 compute-0 nova_compute[192903]: 2025-10-06 14:22:01.568 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.126 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 2c27d4dd-209a-4cae-bfaf-651d66d96457 has allocations against this compute host but is not found in the database.
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.127 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.127 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:22:01 up  1:23,  0 user,  load average: 0.29, 0.36, 0.38\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.163 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.674 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.907 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Check if temp file /var/lib/nova/instances/tmp3cmewfnz exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 06 14:22:03 compute-0 nova_compute[192903]: 2025-10-06 14:22:03.911 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3cmewfnz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f29ba44d-139b-42c0-8270-fb9071f47ce0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 06 14:22:04 compute-0 nova_compute[192903]: 2025-10-06 14:22:04.185 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:22:04 compute-0 nova_compute[192903]: 2025-10-06 14:22:04.186 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.617s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:04 compute-0 nova_compute[192903]: 2025-10-06 14:22:04.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:05 compute-0 nova_compute[192903]: 2025-10-06 14:22:05.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.183 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.183 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.184 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.184 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.553 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.623 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.624 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.687 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.690 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Preparing to wait for external event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.691 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.691 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:08 compute-0 nova_compute[192903]: 2025-10-06 14:22:08.692 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:09 compute-0 nova_compute[192903]: 2025-10-06 14:22:09.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:10 compute-0 nova_compute[192903]: 2025-10-06 14:22:10.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:11.389 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:11.389 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:11.390 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:11 compute-0 nova_compute[192903]: 2025-10-06 14:22:11.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:12 compute-0 nova_compute[192903]: 2025-10-06 14:22:12.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:14.596 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:14.598 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.622 2 DEBUG nova.compute.manager [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.623 2 DEBUG oslo_concurrency.lockutils [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.624 2 DEBUG oslo_concurrency.lockutils [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.624 2 DEBUG oslo_concurrency.lockutils [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.625 2 DEBUG nova.compute.manager [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No event matching network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e in dict_keys([('network-vif-plugged', '16d53745-b5c8-468d-889b-b34847d3dd0e')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 06 14:22:14 compute-0 nova_compute[192903]: 2025-10-06 14:22:14.625 2 DEBUG nova.compute.manager [req-71a61945-147d-4210-ad68-e53b42d9ed45 req-2eee8411-ecf0-4892-9f50-488e0acb8e88 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:22:15 compute-0 podman[225124]: 2025-10-06 14:22:15.215777911 +0000 UTC m=+0.072666349 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:22:15 compute-0 podman[225125]: 2025-10-06 14:22:15.23705008 +0000 UTC m=+0.076927071 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:22:15 compute-0 podman[225131]: 2025-10-06 14:22:15.269072881 +0000 UTC m=+0.110440471 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:22:15 compute-0 podman[225123]: 2025-10-06 14:22:15.29493729 +0000 UTC m=+0.157244530 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:22:15 compute-0 nova_compute[192903]: 2025-10-06 14:22:15.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:15.600 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:22:15 compute-0 nova_compute[192903]: 2025-10-06 14:22:15.719 2 INFO nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.678 2 DEBUG nova.compute.manager [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.679 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.679 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.680 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.680 2 DEBUG nova.compute.manager [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Processing event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.681 2 DEBUG nova.compute.manager [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-changed-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.681 2 DEBUG nova.compute.manager [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Refreshing instance network info cache due to event network-changed-16d53745-b5c8-468d-889b-b34847d3dd0e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.682 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.682 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.683 2 DEBUG nova.network.neutron [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Refreshing network info cache for port 16d53745-b5c8-468d-889b-b34847d3dd0e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:22:16 compute-0 nova_compute[192903]: 2025-10-06 14:22:16.684 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.193 2 WARNING neutronclient.v2_0.client [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.202 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3cmewfnz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f29ba44d-139b-42c0-8270-fb9071f47ce0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2c27d4dd-209a-4cae-bfaf-651d66d96457),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.612 2 WARNING neutronclient.v2_0.client [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.721 2 DEBUG nova.objects.instance [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid f29ba44d-139b-42c0-8270-fb9071f47ce0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.723 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.725 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.725 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.748 2 DEBUG nova.network.neutron [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Updated VIF entry in instance network info cache for port 16d53745-b5c8-468d-889b-b34847d3dd0e. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 06 14:22:17 compute-0 nova_compute[192903]: 2025-10-06 14:22:17.749 2 DEBUG nova.network.neutron [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Updating instance_info_cache with network_info: [{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.228 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.229 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.233 2 DEBUG nova.virt.libvirt.vif [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-489099389',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-489099389',id=23,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:21:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-bi03b5yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:21:30Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f29ba44d-139b-42c0-8270-fb9071f47ce0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.234 2 DEBUG nova.network.os_vif_util [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.235 2 DEBUG nova.network.os_vif_util [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.236 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Updating guest XML with vif config: <interface type="ethernet">
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <mac address="fa:16:3e:00:d4:27"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <model type="virtio"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <mtu size="1442"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <target dev="tap16d53745-b5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]: </interface>
Oct 06 14:22:18 compute-0 nova_compute[192903]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.237 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <name>instance-00000017</name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <uuid>f29ba44d-139b-42c0-8270-fb9071f47ce0</uuid>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-489099389</nova:name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:21:24</nova:creationTime>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:port uuid="16d53745-b5c8-468d-889b-b34847d3dd0e">
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="serial">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="uuid">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:00:d4:27"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="tap16d53745-b5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </target>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </console>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </input>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]: </domain>
Oct 06 14:22:18 compute-0 nova_compute[192903]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.242 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <name>instance-00000017</name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <uuid>f29ba44d-139b-42c0-8270-fb9071f47ce0</uuid>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-489099389</nova:name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:21:24</nova:creationTime>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:port uuid="16d53745-b5c8-468d-889b-b34847d3dd0e">
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="serial">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="uuid">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <interface type="ethernet"><mac address="fa:16:3e:00:d4:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16d53745-b5"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </interface><serial type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </target>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </console>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </input>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]: </domain>
Oct 06 14:22:18 compute-0 nova_compute[192903]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.242 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <name>instance-00000017</name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <uuid>f29ba44d-139b-42c0-8270-fb9071f47ce0</uuid>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteStrategies-server-489099389</nova:name>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:21:24</nova:creationTime>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:user uuid="98ee6da236ba42baa0fef11dcb52cbdd">tempest-TestExecuteStrategies-1255317741-project-admin</nova:user>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:project uuid="8f3f3b7d20fc4715811486da569fc0ab">tempest-TestExecuteStrategies-1255317741</nova:project>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <nova:port uuid="16d53745-b5c8-468d-889b-b34847d3dd0e">
Oct 06 14:22:18 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <memory unit="KiB">131072</memory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <vcpu placement="static">1</vcpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <partition>/machine</partition>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </resource>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="serial">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="uuid">f29ba44d-139b-42c0-8270-fb9071f47ce0</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </system>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </os>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <vmcoreinfo state="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </features>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <cpu mode="host-model" check="partial">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_poweroff>destroy</on_poweroff>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_reboot>restart</on_reboot>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <on_crash>destroy</on_crash>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/disk.config"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <readonly/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="1" port="0x10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="2" port="0x11"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="3" port="0x12"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="4" port="0x13"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="5" port="0x14"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="6" port="0x15"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="7" port="0x16"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="8" port="0x17"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="9" port="0x18"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="10" port="0x19"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="11" port="0x1a"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="12" port="0x1b"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="13" port="0x1c"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="14" port="0x1d"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="15" port="0x1e"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="16" port="0x1f"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="17" port="0x20"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="18" port="0x21"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="19" port="0x22"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="20" port="0x23"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="21" port="0x24"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="22" port="0x25"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="23" port="0x26"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="24" port="0x27"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-root-port"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target chassis="25" port="0x28"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model name="pcie-pci-bridge"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <controller type="sata" index="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </controller>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <interface type="ethernet"><mac address="fa:16:3e:00:d4:27"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap16d53745-b5"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </interface><serial type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="isa-serial" port="0">
Oct 06 14:22:18 compute-0 nova_compute[192903]:         <model name="isa-serial"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       </target>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <console type="pty">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0/console.log" append="off"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <target type="serial" port="0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </console>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="usb" bus="0" port="1"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </input>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <input type="mouse" bus="ps2"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <listen type="address" address="::"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </graphics>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <model type="virtio" heads="1" primary="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </video>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:22:18 compute-0 nova_compute[192903]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:22:18 compute-0 nova_compute[192903]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 06 14:22:18 compute-0 nova_compute[192903]: </domain>
Oct 06 14:22:18 compute-0 nova_compute[192903]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.243 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.255 2 DEBUG oslo_concurrency.lockutils [req-3f5608ad-eec7-4a34-b2e3-f23409b63738 req-95c63770-2783-42b4-a94b-417c4d07e5a2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-f29ba44d-139b-42c0-8270-fb9071f47ce0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.732 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 06 14:22:18 compute-0 nova_compute[192903]: 2025-10-06 14:22:18.732 2 INFO nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 06 14:22:19 compute-0 nova_compute[192903]: 2025-10-06 14:22:19.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:19 compute-0 nova_compute[192903]: 2025-10-06 14:22:19.752 2 INFO nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.270 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.271 2 DEBUG nova.virt.libvirt.migration [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 06 14:22:20 compute-0 kernel: tap16d53745-b5 (unregistering): left promiscuous mode
Oct 06 14:22:20 compute-0 NetworkManager[52035]: <info>  [1759760540.3393] device (tap16d53745-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 ovn_controller[95205]: 2025-10-06T14:22:20Z|00213|binding|INFO|Releasing lport 16d53745-b5c8-468d-889b-b34847d3dd0e from this chassis (sb_readonly=0)
Oct 06 14:22:20 compute-0 ovn_controller[95205]: 2025-10-06T14:22:20Z|00214|binding|INFO|Setting lport 16d53745-b5c8-468d-889b-b34847d3dd0e down in Southbound
Oct 06 14:22:20 compute-0 ovn_controller[95205]: 2025-10-06T14:22:20Z|00215|binding|INFO|Removing iface tap16d53745-b5 ovn-installed in OVS
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.356 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:d4:27 10.100.0.8'], port_security=['fa:16:3e:00:d4:27 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '7f5c9d61-0a9d-467d-89a7-11e43e674cfc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f29ba44d-139b-42c0-8270-fb9071f47ce0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=16d53745-b5c8-468d-889b-b34847d3dd0e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.358 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 16d53745-b5c8-468d-889b-b34847d3dd0e in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.359 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.361 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3ee23e-f13b-4078-afd2-7feddb7c9c6e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.362 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Deactivated successfully.
Oct 06 14:22:20 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Consumed 14.953s CPU time.
Oct 06 14:22:20 compute-0 systemd-machined[152985]: Machine qemu-18-instance-00000017 terminated.
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG nova.compute.manager [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG oslo_concurrency.lockutils [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG oslo_concurrency.lockutils [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG oslo_concurrency.lockutils [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG nova.compute.manager [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.507 2 DEBUG nova.compute.manager [req-6313a414-58d8-46a3-90d3-bd564c00cd13 req-65fa43cf-029a-47c0-8a2f-fe2f93407bf0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:22:20 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [NOTICE]   (224955) : haproxy version is 3.0.5-8e879a5
Oct 06 14:22:20 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [NOTICE]   (224955) : path to executable is /usr/sbin/haproxy
Oct 06 14:22:20 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [WARNING]  (224955) : Exiting Master process...
Oct 06 14:22:20 compute-0 podman[225241]: 2025-10-06 14:22:20.510150582 +0000 UTC m=+0.040299220 container kill 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 06 14:22:20 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [ALERT]    (224955) : Current worker (224957) exited with code 143 (Terminated)
Oct 06 14:22:20 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[224951]: [WARNING]  (224955) : All workers exited. Exiting... (0)
Oct 06 14:22:20 compute-0 systemd[1]: libpod-99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70.scope: Deactivated successfully.
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 podman[225259]: 2025-10-06 14:22:20.564190501 +0000 UTC m=+0.026078776 container died 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.576 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.577 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.577 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 06 14:22:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70-userdata-shm.mount: Deactivated successfully.
Oct 06 14:22:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-b6bfd7a9b7bd3bf5a393ed8fbed24552dd4cea724ef7bb3ed28d52958bf71d0f-merged.mount: Deactivated successfully.
Oct 06 14:22:20 compute-0 podman[225259]: 2025-10-06 14:22:20.614563294 +0000 UTC m=+0.076451549 container remove 99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 14:22:20 compute-0 systemd[1]: libpod-conmon-99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70.scope: Deactivated successfully.
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.623 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f3d8a3-a829-4be9-b177-f1eb97d51aef]: (4, ("Mon Oct  6 02:22:20 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70)\n99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70\nMon Oct  6 02:22:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70)\n99762feee6e6f0531663264b5f7313288af8ef281096c5a1f68df5b72d87af70\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.625 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[469fc632-ef86-44fc-a918-1ffff847b0d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.625 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.626 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4717632a-f5c3-49e7-a3aa-f3433d140880]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.627 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.655 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a224e072-a09f-4a24-9b11-0b322f6ac9cf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.687 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f6d64e-a066-4191-8b3b-e71449a1da20]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.688 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4cc6f3-a8dc-4060-b818-622066f8a81c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.716 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3653c329-623c-43a0-854c-8605799b7222]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495027, 'reachable_time': 32929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225304, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.720 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:22:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:22:20.721 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[95b5d89e-4620-4b8b-9122-e542e3dc7f40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:22:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.775 2 DEBUG nova.virt.libvirt.guest [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'f29ba44d-139b-42c0-8270-fb9071f47ce0' (instance-00000017) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.775 2 INFO nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migration operation has completed
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.775 2 INFO nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] _post_live_migration() is started..
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.797 2 WARNING neutronclient.v2_0.client [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:22:20 compute-0 nova_compute[192903]: 2025-10-06 14:22:20.798 2 WARNING neutronclient.v2_0.client [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.344 2 DEBUG nova.network.neutron [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Activated binding for port 16d53745-b5c8-468d-889b-b34847d3dd0e and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.345 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.347 2 DEBUG nova.virt.libvirt.vif [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-489099389',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-489099389',id=23,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:21:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-bi03b5yx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:21:59Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f29ba44d-139b-42c0-8270-fb9071f47ce0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.347 2 DEBUG nova.network.os_vif_util [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "16d53745-b5c8-468d-889b-b34847d3dd0e", "address": "fa:16:3e:00:d4:27", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16d53745-b5", "ovs_interfaceid": "16d53745-b5c8-468d-889b-b34847d3dd0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.348 2 DEBUG nova.network.os_vif_util [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.349 2 DEBUG os_vif [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.354 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d53745-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.362 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9527fa08-5101-4b90-b777-b9b2dbe0059d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.368 2 INFO os_vif [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d4:27,bridge_name='br-int',has_traffic_filtering=True,id=16d53745-b5c8-468d-889b-b34847d3dd0e,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16d53745-b5')
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.369 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.369 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.370 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.370 2 DEBUG nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.370 2 INFO nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Deleting instance files /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0_del
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.371 2 INFO nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Deletion of /var/lib/nova/instances/f29ba44d-139b-42c0-8270-fb9071f47ce0_del complete
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.572 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.572 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.573 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.573 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.573 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.573 2 WARNING nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received unexpected event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with vm_state active and task_state migrating.
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.574 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.574 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.574 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.574 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.575 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.575 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.575 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.575 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.576 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.576 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.576 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.576 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-unplugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.576 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.577 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.577 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.577 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.577 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.578 2 WARNING nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received unexpected event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with vm_state active and task_state migrating.
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.578 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.578 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.578 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.579 2 DEBUG oslo_concurrency.lockutils [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.579 2 DEBUG nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] No waiting events found dispatching network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:22:22 compute-0 nova_compute[192903]: 2025-10-06 14:22:22.579 2 WARNING nova.compute.manager [req-e1d4eb9b-2cc5-439b-a217-59aaea886163 req-abeab4ee-2ae6-4f85-b8ae-ad0085639a86 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Received unexpected event network-vif-plugged-16d53745-b5c8-468d-889b-b34847d3dd0e for instance with vm_state active and task_state migrating.
Oct 06 14:22:24 compute-0 nova_compute[192903]: 2025-10-06 14:22:24.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:25 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:22:27 compute-0 podman[225306]: 2025-10-06 14:22:27.241095605 +0000 UTC m=+0.100894160 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:22:27 compute-0 nova_compute[192903]: 2025-10-06 14:22:27.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:29 compute-0 nova_compute[192903]: 2025-10-06 14:22:29.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:29 compute-0 podman[203308]: time="2025-10-06T14:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:22:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:22:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 06 14:22:30 compute-0 podman[225327]: 2025-10-06 14:22:30.216753515 +0000 UTC m=+0.080501285 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 14:22:31 compute-0 openstack_network_exporter[205500]: ERROR   14:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:22:31 compute-0 openstack_network_exporter[205500]: ERROR   14:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:22:31 compute-0 openstack_network_exporter[205500]: ERROR   14:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:22:31 compute-0 openstack_network_exporter[205500]: ERROR   14:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:22:31 compute-0 openstack_network_exporter[205500]: ERROR   14:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:22:32 compute-0 nova_compute[192903]: 2025-10-06 14:22:32.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.411 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.411 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.412 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f29ba44d-139b-42c0-8270-fb9071f47ce0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.928 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.929 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.929 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:33 compute-0 nova_compute[192903]: 2025-10-06 14:22:33.929 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.137 2 WARNING nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.140 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.165 2 DEBUG oslo_concurrency.processutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.166 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.3019905090332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", 
"product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.166 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.166 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:22:34 compute-0 nova_compute[192903]: 2025-10-06 14:22:34.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.184 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Migration for instance f29ba44d-139b-42c0-8270-fb9071f47ce0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.696 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.776 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Migration 2c27d4dd-209a-4cae-bfaf-651d66d96457 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.777 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.777 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:22:34 up  1:23,  0 user,  load average: 0.17, 0.33, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.796 2 DEBUG nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.828 2 DEBUG nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.829 2 DEBUG nova.compute.provider_tree [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.852 2 DEBUG nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.873 2 DEBUG nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STATUS_DISABLED,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_
NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:22:35 compute-0 nova_compute[192903]: 2025-10-06 14:22:35.906 2 DEBUG nova.compute.provider_tree [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:22:36 compute-0 nova_compute[192903]: 2025-10-06 14:22:36.412 2 DEBUG nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:22:36 compute-0 nova_compute[192903]: 2025-10-06 14:22:36.924 2 DEBUG nova.compute.resource_tracker [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:22:36 compute-0 nova_compute[192903]: 2025-10-06 14:22:36.925 2 DEBUG oslo_concurrency.lockutils [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.758s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:22:36 compute-0 nova_compute[192903]: 2025-10-06 14:22:36.946 2 INFO nova.compute.manager [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 06 14:22:37 compute-0 nova_compute[192903]: 2025-10-06 14:22:37.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:38 compute-0 nova_compute[192903]: 2025-10-06 14:22:38.014 2 INFO nova.scheduler.client.report [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Deleted allocation for migration 2c27d4dd-209a-4cae-bfaf-651d66d96457
Oct 06 14:22:38 compute-0 nova_compute[192903]: 2025-10-06 14:22:38.014 2 DEBUG nova.virt.libvirt.driver [None req-e9ec52c2-639d-49da-b599-217293d9645e f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f29ba44d-139b-42c0-8270-fb9071f47ce0] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 06 14:22:39 compute-0 nova_compute[192903]: 2025-10-06 14:22:39.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:42 compute-0 nova_compute[192903]: 2025-10-06 14:22:42.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:44 compute-0 nova_compute[192903]: 2025-10-06 14:22:44.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:46 compute-0 podman[225353]: 2025-10-06 14:22:46.235773314 +0000 UTC m=+0.076336475 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:22:46 compute-0 podman[225351]: 2025-10-06 14:22:46.251948128 +0000 UTC m=+0.100862699 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct 06 14:22:46 compute-0 podman[225352]: 2025-10-06 14:22:46.256648742 +0000 UTC m=+0.095726304 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 06 14:22:46 compute-0 podman[225350]: 2025-10-06 14:22:46.318028753 +0000 UTC m=+0.174023499 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 06 14:22:47 compute-0 nova_compute[192903]: 2025-10-06 14:22:47.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:49 compute-0 nova_compute[192903]: 2025-10-06 14:22:49.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:50 compute-0 nova_compute[192903]: 2025-10-06 14:22:50.044 2 DEBUG nova.compute.manager [None req-ff178160-9319-4e85-837d-8fbada86ee36 6fa7b295cc3748d282cf3095cde65304 fd142f68afa1489aa76784748e93db34 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 06 14:22:50 compute-0 nova_compute[192903]: 2025-10-06 14:22:50.096 2 DEBUG nova.compute.provider_tree [None req-ff178160-9319-4e85-837d-8fbada86ee36 6fa7b295cc3748d282cf3095cde65304 fd142f68afa1489aa76784748e93db34 - - default default] Updating resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 generation from 27 to 29 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 06 14:22:52 compute-0 nova_compute[192903]: 2025-10-06 14:22:52.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:54 compute-0 nova_compute[192903]: 2025-10-06 14:22:54.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:57 compute-0 nova_compute[192903]: 2025-10-06 14:22:57.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:58 compute-0 podman[225431]: 2025-10-06 14:22:58.190493464 +0000 UTC m=+0.057087301 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:22:58 compute-0 nova_compute[192903]: 2025-10-06 14:22:58.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:59 compute-0 nova_compute[192903]: 2025-10-06 14:22:59.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:22:59 compute-0 nova_compute[192903]: 2025-10-06 14:22:59.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:22:59 compute-0 podman[203308]: time="2025-10-06T14:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:22:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:22:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.092 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.094 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.243 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.245 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.269 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.270 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5862MB free_disk=73.3019905090332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.271 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:23:00 compute-0 nova_compute[192903]: 2025-10-06 14:23:00.272 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:23:01 compute-0 podman[225452]: 2025-10-06 14:23:01.210200611 +0000 UTC m=+0.072504545 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9)
Oct 06 14:23:01 compute-0 nova_compute[192903]: 2025-10-06 14:23:01.319 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:23:01 compute-0 nova_compute[192903]: 2025-10-06 14:23:01.319 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:23:00 up  1:24,  0 user,  load average: 0.10, 0.29, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:23:01 compute-0 nova_compute[192903]: 2025-10-06 14:23:01.340 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: ERROR   14:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: ERROR   14:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: ERROR   14:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: ERROR   14:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: ERROR   14:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:23:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:23:01 compute-0 nova_compute[192903]: 2025-10-06 14:23:01.848 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:23:02 compute-0 nova_compute[192903]: 2025-10-06 14:23:02.366 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:23:02 compute-0 nova_compute[192903]: 2025-10-06 14:23:02.366 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:23:02 compute-0 nova_compute[192903]: 2025-10-06 14:23:02.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:04 compute-0 nova_compute[192903]: 2025-10-06 14:23:04.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:06 compute-0 nova_compute[192903]: 2025-10-06 14:23:06.363 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:06 compute-0 nova_compute[192903]: 2025-10-06 14:23:06.363 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:06 compute-0 nova_compute[192903]: 2025-10-06 14:23:06.364 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:06 compute-0 nova_compute[192903]: 2025-10-06 14:23:06.364 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:23:06 compute-0 nova_compute[192903]: 2025-10-06 14:23:06.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:07 compute-0 nova_compute[192903]: 2025-10-06 14:23:07.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:09 compute-0 nova_compute[192903]: 2025-10-06 14:23:09.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:10 compute-0 nova_compute[192903]: 2025-10-06 14:23:10.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:11.391 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:23:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:11.392 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:23:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:11.392 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:23:12 compute-0 nova_compute[192903]: 2025-10-06 14:23:12.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:12 compute-0 nova_compute[192903]: 2025-10-06 14:23:12.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:14 compute-0 nova_compute[192903]: 2025-10-06 14:23:14.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:14 compute-0 nova_compute[192903]: 2025-10-06 14:23:14.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:17 compute-0 podman[225475]: 2025-10-06 14:23:17.212453438 +0000 UTC m=+0.072612228 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:23:17 compute-0 podman[225482]: 2025-10-06 14:23:17.214154452 +0000 UTC m=+0.065455180 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:23:17 compute-0 podman[225476]: 2025-10-06 14:23:17.229098235 +0000 UTC m=+0.078388000 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent)
Oct 06 14:23:17 compute-0 podman[225474]: 2025-10-06 14:23:17.239025145 +0000 UTC m=+0.109772853 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 06 14:23:17 compute-0 nova_compute[192903]: 2025-10-06 14:23:17.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:19 compute-0 nova_compute[192903]: 2025-10-06 14:23:19.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:22 compute-0 nova_compute[192903]: 2025-10-06 14:23:22.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:24 compute-0 nova_compute[192903]: 2025-10-06 14:23:24.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:27 compute-0 nova_compute[192903]: 2025-10-06 14:23:27.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:29 compute-0 podman[225559]: 2025-10-06 14:23:29.201276073 +0000 UTC m=+0.062137353 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:23:29 compute-0 nova_compute[192903]: 2025-10-06 14:23:29.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:29 compute-0 podman[203308]: time="2025-10-06T14:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:23:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:23:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: ERROR   14:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: ERROR   14:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: ERROR   14:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: ERROR   14:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: ERROR   14:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:23:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:23:32 compute-0 podman[225580]: 2025-10-06 14:23:32.221815432 +0000 UTC m=+0.084872049 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter)
Oct 06 14:23:32 compute-0 nova_compute[192903]: 2025-10-06 14:23:32.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:34.466 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:23:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:34.467 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:23:34 compute-0 nova_compute[192903]: 2025-10-06 14:23:34.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:34 compute-0 nova_compute[192903]: 2025-10-06 14:23:34.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:37 compute-0 nova_compute[192903]: 2025-10-06 14:23:37.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:39 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:23:39.468 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:23:39 compute-0 nova_compute[192903]: 2025-10-06 14:23:39.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:42 compute-0 nova_compute[192903]: 2025-10-06 14:23:42.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:44 compute-0 nova_compute[192903]: 2025-10-06 14:23:44.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:47 compute-0 nova_compute[192903]: 2025-10-06 14:23:47.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:48 compute-0 podman[225610]: 2025-10-06 14:23:48.219824637 +0000 UTC m=+0.068699915 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:23:48 compute-0 podman[225604]: 2025-10-06 14:23:48.235189231 +0000 UTC m=+0.078814691 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:23:48 compute-0 podman[225603]: 2025-10-06 14:23:48.266842542 +0000 UTC m=+0.114403525 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:23:48 compute-0 podman[225602]: 2025-10-06 14:23:48.273878267 +0000 UTC m=+0.136431374 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 06 14:23:49 compute-0 nova_compute[192903]: 2025-10-06 14:23:49.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:52 compute-0 nova_compute[192903]: 2025-10-06 14:23:52.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:54 compute-0 nova_compute[192903]: 2025-10-06 14:23:54.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:57 compute-0 nova_compute[192903]: 2025-10-06 14:23:57.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:58 compute-0 nova_compute[192903]: 2025-10-06 14:23:58.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:23:59 compute-0 nova_compute[192903]: 2025-10-06 14:23:59.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:23:59 compute-0 podman[203308]: time="2025-10-06T14:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:23:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:23:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 06 14:24:00 compute-0 podman[225689]: 2025-10-06 14:24:00.197177373 +0000 UTC m=+0.061618939 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:24:00 compute-0 nova_compute[192903]: 2025-10-06 14:24:00.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.106 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.107 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.107 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.107 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.256 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.258 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.280 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.281 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.3019905090332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.282 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:01 compute-0 nova_compute[192903]: 2025-10-06 14:24:01.282 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: ERROR   14:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: ERROR   14:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: ERROR   14:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: ERROR   14:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: ERROR   14:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:24:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:24:02 compute-0 nova_compute[192903]: 2025-10-06 14:24:02.327 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:24:02 compute-0 nova_compute[192903]: 2025-10-06 14:24:02.328 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:24:01 up  1:25,  0 user,  load average: 0.09, 0.25, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:24:02 compute-0 nova_compute[192903]: 2025-10-06 14:24:02.348 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:24:02 compute-0 nova_compute[192903]: 2025-10-06 14:24:02.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:02 compute-0 nova_compute[192903]: 2025-10-06 14:24:02.854 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:24:03 compute-0 podman[225710]: 2025-10-06 14:24:03.194340118 +0000 UTC m=+0.061616279 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 06 14:24:03 compute-0 nova_compute[192903]: 2025-10-06 14:24:03.366 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:24:03 compute-0 nova_compute[192903]: 2025-10-06 14:24:03.367 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:03 compute-0 nova_compute[192903]: 2025-10-06 14:24:03.367 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:03 compute-0 nova_compute[192903]: 2025-10-06 14:24:03.367 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:24:03 compute-0 ovn_controller[95205]: 2025-10-06T14:24:03Z|00216|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:24:03 compute-0 nova_compute[192903]: 2025-10-06 14:24:03.874 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:24:04 compute-0 nova_compute[192903]: 2025-10-06 14:24:04.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.014 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Creating tmpfile /var/lib/nova/instances/tmpwuzz4_w9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.015 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.018 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Creating tmpfile /var/lib/nova/instances/tmpkk1eckqy to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.019 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.022 2 DEBUG nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwuzz4_w9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.027 2 DEBUG nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkk1eckqy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:05 compute-0 nova_compute[192903]: 2025-10-06 14:24:05.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:07 compute-0 nova_compute[192903]: 2025-10-06 14:24:07.051 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:07 compute-0 nova_compute[192903]: 2025-10-06 14:24:07.056 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:07 compute-0 nova_compute[192903]: 2025-10-06 14:24:07.086 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:07 compute-0 nova_compute[192903]: 2025-10-06 14:24:07.087 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:24:07 compute-0 nova_compute[192903]: 2025-10-06 14:24:07.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:09 compute-0 nova_compute[192903]: 2025-10-06 14:24:09.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:11.393 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:11.393 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:11.393 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:11 compute-0 nova_compute[192903]: 2025-10-06 14:24:11.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:12 compute-0 nova_compute[192903]: 2025-10-06 14:24:12.242 2 DEBUG nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkk1eckqy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='91e3d012-6b97-466e-a069-c3e4424d2f67',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:24:12 compute-0 nova_compute[192903]: 2025-10-06 14:24:12.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:12 compute-0 nova_compute[192903]: 2025-10-06 14:24:12.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:13 compute-0 nova_compute[192903]: 2025-10-06 14:24:13.259 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:24:13 compute-0 nova_compute[192903]: 2025-10-06 14:24:13.260 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:24:13 compute-0 nova_compute[192903]: 2025-10-06 14:24:13.260 2 DEBUG nova.network.neutron [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:24:13 compute-0 nova_compute[192903]: 2025-10-06 14:24:13.769 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:14 compute-0 nova_compute[192903]: 2025-10-06 14:24:14.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:14 compute-0 nova_compute[192903]: 2025-10-06 14:24:14.604 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:14 compute-0 nova_compute[192903]: 2025-10-06 14:24:14.807 2 DEBUG nova.network.neutron [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Updating instance_info_cache with network_info: [{"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.314 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.329 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkk1eckqy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='91e3d012-6b97-466e-a069-c3e4424d2f67',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.330 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Creating instance directory: /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.331 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Creating disk.info with the contents: {'/var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk': 'qcow2', '/var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.331 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.332 2 DEBUG nova.objects.instance [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 91e3d012-6b97-466e-a069-c3e4424d2f67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.839 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.847 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.849 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.924 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.925 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.926 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.930 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.936 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.936 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.992 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:15 compute-0 nova_compute[192903]: 2025-10-06 14:24:15.993 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.031 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.033 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.033 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.101 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.103 2 DEBUG nova.virt.disk.api [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.103 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.186 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.187 2 DEBUG nova.virt.disk.api [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.188 2 DEBUG nova.objects.instance [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 91e3d012-6b97-466e-a069-c3e4424d2f67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.700 2 DEBUG nova.objects.base [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<91e3d012-6b97-466e-a069-c3e4424d2f67> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.701 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.741 2 DEBUG oslo_concurrency.processutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.742 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.744 2 DEBUG nova.virt.libvirt.vif [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1149461266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1149461266',id=24,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:23:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-c006e5sm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:23:12Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=91e3d012-6b97-466e-a069-c3e4424d2f67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.744 2 DEBUG nova.network.os_vif_util [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.745 2 DEBUG nova.network.os_vif_util [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.745 2 DEBUG os_vif [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.746 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.747 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ff64f44b-c055-5b7f-9435-1bba9c56e317', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.756 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65f56f13-1c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.757 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap65f56f13-1c, col_values=(('qos', UUID('6ed53db1-dfbf-42a9-834f-7c167051bebd')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.757 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap65f56f13-1c, col_values=(('external_ids', {'iface-id': '65f56f13-1ccd-4101-aa20-05f1634ca2df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:46:b4', 'vm-uuid': '91e3d012-6b97-466e-a069-c3e4424d2f67'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 NetworkManager[52035]: <info>  [1759760656.7602] manager: (tap65f56f13-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.767 2 INFO os_vif [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c')
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.767 2 DEBUG nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.768 2 DEBUG nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkk1eckqy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='91e3d012-6b97-466e-a069-c3e4424d2f67',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.768 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:16 compute-0 nova_compute[192903]: 2025-10-06 14:24:16.894 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:17 compute-0 nova_compute[192903]: 2025-10-06 14:24:17.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:17.591 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:24:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:17.592 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:24:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:18.593 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:18 compute-0 nova_compute[192903]: 2025-10-06 14:24:18.636 2 DEBUG nova.network.neutron [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Port 65f56f13-1ccd-4101-aa20-05f1634ca2df updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:24:18 compute-0 nova_compute[192903]: 2025-10-06 14:24:18.647 2 DEBUG nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkk1eckqy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='91e3d012-6b97-466e-a069-c3e4424d2f67',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:24:19 compute-0 podman[225755]: 2025-10-06 14:24:19.202125553 +0000 UTC m=+0.061767203 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 14:24:19 compute-0 podman[225756]: 2025-10-06 14:24:19.203825157 +0000 UTC m=+0.062278226 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:24:19 compute-0 podman[225757]: 2025-10-06 14:24:19.210819821 +0000 UTC m=+0.069164747 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:24:19 compute-0 podman[225754]: 2025-10-06 14:24:19.231161455 +0000 UTC m=+0.097404209 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 14:24:19 compute-0 nova_compute[192903]: 2025-10-06 14:24:19.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:24:19 compute-0 nova_compute[192903]: 2025-10-06 14:24:19.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:24:19 compute-0 nova_compute[192903]: 2025-10-06 14:24:19.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:24:21 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:24:21 compute-0 kernel: tap65f56f13-1c: entered promiscuous mode
Oct 06 14:24:21 compute-0 nova_compute[192903]: 2025-10-06 14:24:21.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 nova_compute[192903]: 2025-10-06 14:24:21.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 ovn_controller[95205]: 2025-10-06T14:24:21Z|00217|binding|INFO|Claiming lport 65f56f13-1ccd-4101-aa20-05f1634ca2df for this additional chassis.
Oct 06 14:24:21 compute-0 ovn_controller[95205]: 2025-10-06T14:24:21Z|00218|binding|INFO|65f56f13-1ccd-4101-aa20-05f1634ca2df: Claiming fa:16:3e:26:46:b4 10.100.0.11
Oct 06 14:24:21 compute-0 NetworkManager[52035]: <info>  [1759760661.7661] manager: (tap65f56f13-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.771 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:46:b4 10.100.0.11'], port_security=['fa:16:3e:26:46:b4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '91e3d012-6b97-466e-a069-c3e4424d2f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=65f56f13-1ccd-4101-aa20-05f1634ca2df) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.773 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 65f56f13-1ccd-4101-aa20-05f1634ca2df in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.774 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:24:21 compute-0 nova_compute[192903]: 2025-10-06 14:24:21.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 ovn_controller[95205]: 2025-10-06T14:24:21Z|00219|binding|INFO|Setting lport 65f56f13-1ccd-4101-aa20-05f1634ca2df ovn-installed in OVS
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.794 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1b88c859-26aa-48af-a9d0-0fb9088fc4c2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.795 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:24:21 compute-0 nova_compute[192903]: 2025-10-06 14:24:21.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.798 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.798 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5de037bd-a81f-43f4-9c30-3c1680f198d4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 nova_compute[192903]: 2025-10-06 14:24:21.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.800 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[41497534-c3a4-4a7e-a2fe-6356c7c11eb8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 systemd-udevd[225877]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.826 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[abbf3e07-dbc9-4058-a79b-c1c180b4a04c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 systemd-machined[152985]: New machine qemu-19-instance-00000018.
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.848 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1049d1a1-3fa8-4be4-8d16-481d79c10935]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 NetworkManager[52035]: <info>  [1759760661.8498] device (tap65f56f13-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:24:21 compute-0 NetworkManager[52035]: <info>  [1759760661.8512] device (tap65f56f13-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:24:21 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000018.
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.886 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed7a177-c22b-4c03-9021-968eec4242d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 NetworkManager[52035]: <info>  [1759760661.8919] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.893 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[383fb83e-9199-4b1d-8123-0dd15fa71372]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.935 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[0728f1e6-169e-4f04-9c85-967f3c9acca4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.938 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[29d41765-0735-44cb-8c4a-5da58108ef71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:21 compute-0 NetworkManager[52035]: <info>  [1759760661.9707] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:24:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:21.982 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec3052e-a42d-4fc5-b1d1-5bae3b0eefe3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.007 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[382110d5-a16d-4c89-8dac-0957e0ad58ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512361, 'reachable_time': 32193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225908, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.049 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[93ebd2ed-c72c-4194-a0d4-f6a098c7ed9f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512361, 'tstamp': 512361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225909, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.074 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1d597bec-5c9f-46f1-8d8d-88078d89dd46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512361, 'reachable_time': 32193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225910, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.122 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfe48b2-256c-4a15-ad5e-8059b3969104]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.204 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a376d2f6-fb9a-4830-946e-5e4e24d82300]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.205 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.206 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.207 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:22 compute-0 nova_compute[192903]: 2025-10-06 14:24:22.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:22 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:24:22 compute-0 NetworkManager[52035]: <info>  [1759760662.2107] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.212 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:22 compute-0 nova_compute[192903]: 2025-10-06 14:24:22.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:22 compute-0 ovn_controller[95205]: 2025-10-06T14:24:22Z|00220|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:24:22 compute-0 nova_compute[192903]: 2025-10-06 14:24:22.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.229 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6cabb567-4169-4c98-b775-e336207b616a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.229 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.230 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.230 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.230 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.231 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[704c039e-9715-4bb0-8f60-94e85fe15116]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.231 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.232 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e7388e-fd60-418c-a94d-4b56e442d4b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.232 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:24:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:22.233 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:24:22 compute-0 podman[225949]: 2025-10-06 14:24:22.679019956 +0000 UTC m=+0.070311228 container create e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Oct 06 14:24:22 compute-0 systemd[1]: Started libpod-conmon-e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a.scope.
Oct 06 14:24:22 compute-0 podman[225949]: 2025-10-06 14:24:22.639706833 +0000 UTC m=+0.030998195 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:24:22 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:24:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6961321e90d65a514ae7775959adb6aed18a18864b37d4f5ec18559c94f164e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:24:22 compute-0 podman[225949]: 2025-10-06 14:24:22.788114661 +0000 UTC m=+0.179405963 container init e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 06 14:24:22 compute-0 podman[225949]: 2025-10-06 14:24:22.794352144 +0000 UTC m=+0.185643416 container start e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 06 14:24:22 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [NOTICE]   (225968) : New worker (225970) forked
Oct 06 14:24:22 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [NOTICE]   (225968) : Loading success.
Oct 06 14:24:24 compute-0 ovn_controller[95205]: 2025-10-06T14:24:24Z|00221|binding|INFO|Claiming lport 65f56f13-1ccd-4101-aa20-05f1634ca2df for this chassis.
Oct 06 14:24:24 compute-0 ovn_controller[95205]: 2025-10-06T14:24:24Z|00222|binding|INFO|65f56f13-1ccd-4101-aa20-05f1634ca2df: Claiming fa:16:3e:26:46:b4 10.100.0.11
Oct 06 14:24:24 compute-0 ovn_controller[95205]: 2025-10-06T14:24:24Z|00223|binding|INFO|Setting lport 65f56f13-1ccd-4101-aa20-05f1634ca2df up in Southbound
Oct 06 14:24:24 compute-0 nova_compute[192903]: 2025-10-06 14:24:24.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.648 2 INFO nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Post operation of migration started
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.650 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.724 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.725 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.815 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.815 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:24:25 compute-0 nova_compute[192903]: 2025-10-06 14:24:25.815 2 DEBUG nova.network.neutron [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:24:26 compute-0 nova_compute[192903]: 2025-10-06 14:24:26.322 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:26 compute-0 nova_compute[192903]: 2025-10-06 14:24:26.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:29 compute-0 nova_compute[192903]: 2025-10-06 14:24:29.512 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:29 compute-0 nova_compute[192903]: 2025-10-06 14:24:29.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:29 compute-0 podman[203308]: time="2025-10-06T14:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:24:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:24:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Oct 06 14:24:30 compute-0 nova_compute[192903]: 2025-10-06 14:24:30.591 2 DEBUG nova.network.neutron [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Updating instance_info_cache with network_info: [{"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.100 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-91e3d012-6b97-466e-a069-c3e4424d2f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:24:31 compute-0 podman[225994]: 2025-10-06 14:24:31.202438561 +0000 UTC m=+0.066117867 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: ERROR   14:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: ERROR   14:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: ERROR   14:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: ERROR   14:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: ERROR   14:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:24:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.623 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.624 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.624 2 DEBUG oslo_concurrency.lockutils [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.630 2 INFO nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:24:31 compute-0 virtqemud[192802]: Domain id=19 name='instance-00000018' uuid=91e3d012-6b97-466e-a069-c3e4424d2f67 is tainted: custom-monitor
Oct 06 14:24:31 compute-0 nova_compute[192903]: 2025-10-06 14:24:31.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:32 compute-0 nova_compute[192903]: 2025-10-06 14:24:32.639 2 INFO nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:24:33 compute-0 nova_compute[192903]: 2025-10-06 14:24:33.647 2 INFO nova.virt.libvirt.driver [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:24:33 compute-0 nova_compute[192903]: 2025-10-06 14:24:33.653 2 DEBUG nova.compute.manager [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:24:34 compute-0 nova_compute[192903]: 2025-10-06 14:24:34.168 2 DEBUG nova.objects.instance [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:24:34 compute-0 podman[226015]: 2025-10-06 14:24:34.235837598 +0000 UTC m=+0.089174823 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Oct 06 14:24:34 compute-0 nova_compute[192903]: 2025-10-06 14:24:34.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:35 compute-0 nova_compute[192903]: 2025-10-06 14:24:35.185 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:35 compute-0 nova_compute[192903]: 2025-10-06 14:24:35.479 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:35 compute-0 nova_compute[192903]: 2025-10-06 14:24:35.480 2 WARNING neutronclient.v2_0.client [None req-ceaa754e-95c2-4242-b7c6-6fb0f937d08c f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:36 compute-0 nova_compute[192903]: 2025-10-06 14:24:36.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:39 compute-0 nova_compute[192903]: 2025-10-06 14:24:39.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:41 compute-0 nova_compute[192903]: 2025-10-06 14:24:41.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:43 compute-0 nova_compute[192903]: 2025-10-06 14:24:43.851 2 DEBUG nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwuzz4_w9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='40adb852-2234-400a-bacc-bbe8ecad4a52',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:24:44 compute-0 nova_compute[192903]: 2025-10-06 14:24:44.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:44 compute-0 nova_compute[192903]: 2025-10-06 14:24:44.866 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:24:44 compute-0 nova_compute[192903]: 2025-10-06 14:24:44.867 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:24:44 compute-0 nova_compute[192903]: 2025-10-06 14:24:44.867 2 DEBUG nova.network.neutron [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:24:45 compute-0 nova_compute[192903]: 2025-10-06 14:24:45.375 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:46 compute-0 nova_compute[192903]: 2025-10-06 14:24:46.472 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:46 compute-0 nova_compute[192903]: 2025-10-06 14:24:46.669 2 DEBUG nova.network.neutron [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Updating instance_info_cache with network_info: [{"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:24:46 compute-0 nova_compute[192903]: 2025-10-06 14:24:46.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.212 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.244 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwuzz4_w9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='40adb852-2234-400a-bacc-bbe8ecad4a52',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.244 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Creating instance directory: /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.245 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Creating disk.info with the contents: {'/var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk': 'qcow2', '/var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.245 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.245 2 DEBUG nova.objects.instance [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 40adb852-2234-400a-bacc-bbe8ecad4a52 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.755 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.760 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.762 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.828 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.829 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.830 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.830 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.833 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.834 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.881 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.882 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.915 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.916 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.916 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.976 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.977 2 DEBUG nova.virt.disk.api [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:24:47 compute-0 nova_compute[192903]: 2025-10-06 14:24:47.978 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.040 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.042 2 DEBUG nova.virt.disk.api [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.043 2 DEBUG nova.objects.instance [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 40adb852-2234-400a-bacc-bbe8ecad4a52 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.552 2 DEBUG nova.objects.base [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<40adb852-2234-400a-bacc-bbe8ecad4a52> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.553 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.582 2 DEBUG oslo_concurrency.processutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.583 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.584 2 DEBUG nova.virt.libvirt.vif [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:23:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1320006994',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1320006994',id=25,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:23:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-w6qigexn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:23:35Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=40adb852-2234-400a-bacc-bbe8ecad4a52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.584 2 DEBUG nova.network.os_vif_util [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.585 2 DEBUG nova.network.os_vif_util [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.585 2 DEBUG os_vif [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.587 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.588 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '01801c0e-6f12-5219-a028-ebb5273ab92d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.593 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cc4c824-71, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.594 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7cc4c824-71, col_values=(('qos', UUID('5e65d6a9-b5ca-457e-9e4e-1b2792430aa3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.594 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7cc4c824-71, col_values=(('external_ids', {'iface-id': '7cc4c824-7161-48a1-88e5-a3cc6e77170b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:d1:d3', 'vm-uuid': '40adb852-2234-400a-bacc-bbe8ecad4a52'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 NetworkManager[52035]: <info>  [1759760688.5970] manager: (tap7cc4c824-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.604 2 INFO os_vif [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71')
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.604 2 DEBUG nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.605 2 DEBUG nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwuzz4_w9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='40adb852-2234-400a-bacc-bbe8ecad4a52',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:24:48 compute-0 nova_compute[192903]: 2025-10-06 14:24:48.606 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:49 compute-0 nova_compute[192903]: 2025-10-06 14:24:49.461 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:49 compute-0 nova_compute[192903]: 2025-10-06 14:24:49.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:50 compute-0 nova_compute[192903]: 2025-10-06 14:24:50.173 2 DEBUG nova.network.neutron [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Port 7cc4c824-7161-48a1-88e5-a3cc6e77170b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:24:50 compute-0 nova_compute[192903]: 2025-10-06 14:24:50.186 2 DEBUG nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwuzz4_w9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='40adb852-2234-400a-bacc-bbe8ecad4a52',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:24:50 compute-0 podman[226059]: 2025-10-06 14:24:50.255523597 +0000 UTC m=+0.091019851 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:24:50 compute-0 podman[226058]: 2025-10-06 14:24:50.2586714 +0000 UTC m=+0.098596400 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:24:50 compute-0 podman[226057]: 2025-10-06 14:24:50.277593547 +0000 UTC m=+0.131168816 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 14:24:50 compute-0 podman[226065]: 2025-10-06 14:24:50.289992072 +0000 UTC m=+0.120444744 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:24:53 compute-0 nova_compute[192903]: 2025-10-06 14:24:53.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:53 compute-0 kernel: tap7cc4c824-71: entered promiscuous mode
Oct 06 14:24:53 compute-0 NetworkManager[52035]: <info>  [1759760693.6471] manager: (tap7cc4c824-71): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Oct 06 14:24:53 compute-0 ovn_controller[95205]: 2025-10-06T14:24:53Z|00224|binding|INFO|Claiming lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b for this additional chassis.
Oct 06 14:24:53 compute-0 ovn_controller[95205]: 2025-10-06T14:24:53Z|00225|binding|INFO|7cc4c824-7161-48a1-88e5-a3cc6e77170b: Claiming fa:16:3e:1e:d1:d3 10.100.0.6
Oct 06 14:24:53 compute-0 nova_compute[192903]: 2025-10-06 14:24:53.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.658 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:d1:d3 10.100.0.6'], port_security=['fa:16:3e:1e:d1:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '40adb852-2234-400a-bacc-bbe8ecad4a52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=7cc4c824-7161-48a1-88e5-a3cc6e77170b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.659 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 7cc4c824-7161-48a1-88e5-a3cc6e77170b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.660 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:24:53 compute-0 ovn_controller[95205]: 2025-10-06T14:24:53Z|00226|binding|INFO|Setting lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b ovn-installed in OVS
Oct 06 14:24:53 compute-0 nova_compute[192903]: 2025-10-06 14:24:53.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.679 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea35dce-ac75-45b5-b3b1-e359f8d9ba1f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 systemd-machined[152985]: New machine qemu-20-instance-00000019.
Oct 06 14:24:53 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000019.
Oct 06 14:24:53 compute-0 systemd-udevd[226155]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.717 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[5fce4f61-38cf-40a2-881b-81588c471e8f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.721 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[2944171c-cb4d-4ad8-a760-d090234b4da8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 NetworkManager[52035]: <info>  [1759760693.7381] device (tap7cc4c824-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:24:53 compute-0 NetworkManager[52035]: <info>  [1759760693.7389] device (tap7cc4c824-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.756 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[c707f1f0-9758-4600-959d-4fbc90e7e6cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.785 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f8faef-72c2-4800-bb5e-c578f240d450]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512361, 'reachable_time': 32193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226165, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.803 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbb03e2-3cf4-478d-a313-47bb5d497e9a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512380, 'tstamp': 512380}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226167, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512384, 'tstamp': 512384}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226167, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.805 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:53 compute-0 nova_compute[192903]: 2025-10-06 14:24:53.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:53 compute-0 nova_compute[192903]: 2025-10-06 14:24:53.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.808 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.809 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.809 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.809 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:24:53 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:24:53.811 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b5b9cb-0fd6-45e9-8a62-571e02e8f926]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:24:54 compute-0 nova_compute[192903]: 2025-10-06 14:24:54.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:56 compute-0 ovn_controller[95205]: 2025-10-06T14:24:56Z|00227|binding|INFO|Claiming lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b for this chassis.
Oct 06 14:24:56 compute-0 ovn_controller[95205]: 2025-10-06T14:24:56Z|00228|binding|INFO|7cc4c824-7161-48a1-88e5-a3cc6e77170b: Claiming fa:16:3e:1e:d1:d3 10.100.0.6
Oct 06 14:24:56 compute-0 ovn_controller[95205]: 2025-10-06T14:24:56Z|00229|binding|INFO|Setting lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b up in Southbound
Oct 06 14:24:57 compute-0 nova_compute[192903]: 2025-10-06 14:24:57.947 2 INFO nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Post operation of migration started
Oct 06 14:24:57 compute-0 nova_compute[192903]: 2025-10-06 14:24:57.948 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.449 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.451 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.545 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.546 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.546 2 DEBUG nova.network.neutron [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:24:58 compute-0 nova_compute[192903]: 2025-10-06 14:24:58.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:59 compute-0 nova_compute[192903]: 2025-10-06 14:24:59.053 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:59 compute-0 nova_compute[192903]: 2025-10-06 14:24:59.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:24:59 compute-0 podman[203308]: time="2025-10-06T14:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:24:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:24:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3486 "" "Go-http-client/1.1"
Oct 06 14:24:59 compute-0 nova_compute[192903]: 2025-10-06 14:24:59.782 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:24:59 compute-0 nova_compute[192903]: 2025-10-06 14:24:59.956 2 DEBUG nova.network.neutron [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Updating instance_info_cache with network_info: [{"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.089 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.464 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-40adb852-2234-400a-bacc-bbe8ecad4a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.994 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.995 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:00 compute-0 nova_compute[192903]: 2025-10-06 14:25:00.995 2 DEBUG oslo_concurrency.lockutils [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:01 compute-0 nova_compute[192903]: 2025-10-06 14:25:01.001 2 INFO nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:25:01 compute-0 virtqemud[192802]: Domain id=20 name='instance-00000019' uuid=40adb852-2234-400a-bacc-bbe8ecad4a52 is tainted: custom-monitor
Oct 06 14:25:01 compute-0 nova_compute[192903]: 2025-10-06 14:25:01.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:01 compute-0 nova_compute[192903]: 2025-10-06 14:25:01.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:01 compute-0 nova_compute[192903]: 2025-10-06 14:25:01.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:01 compute-0 nova_compute[192903]: 2025-10-06 14:25:01.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: ERROR   14:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: ERROR   14:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: ERROR   14:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: ERROR   14:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: ERROR   14:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:25:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.010 2 INFO nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.200 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:25:02 compute-0 podman[226190]: 2025-10-06 14:25:02.22369259 +0000 UTC m=+0.084660234 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.297 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.298 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.367 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.373 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.422 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.422 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.473 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.684 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.685 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.718 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.718 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5544MB free_disk=73.24245834350586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.718 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:02 compute-0 nova_compute[192903]: 2025-10-06 14:25:02.719 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:03 compute-0 nova_compute[192903]: 2025-10-06 14:25:03.015 2 INFO nova.virt.libvirt.driver [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:25:03 compute-0 nova_compute[192903]: 2025-10-06 14:25:03.022 2 DEBUG nova.compute.manager [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:25:03 compute-0 nova_compute[192903]: 2025-10-06 14:25:03.531 2 DEBUG nova.objects.instance [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:25:03 compute-0 nova_compute[192903]: 2025-10-06 14:25:03.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.251 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration for instance 40adb852-2234-400a-bacc-bbe8ecad4a52 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.548 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.734 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.735 2 WARNING neutronclient.v2_0.client [None req-a94926aa-bba5-4e39-8c97-cc36f8f76b65 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:04 compute-0 nova_compute[192903]: 2025-10-06 14:25:04.758 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:25:05 compute-0 podman[226225]: 2025-10-06 14:25:05.227806747 +0000 UTC m=+0.081793619 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.300 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 40adb852-2234-400a-bacc-bbe8ecad4a52 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.301 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 91e3d012-6b97-466e-a069-c3e4424d2f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.301 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.301 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:25:02 up  1:26,  0 user,  load average: 0.11, 0.22, 0.32\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.381 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:25:05 compute-0 nova_compute[192903]: 2025-10-06 14:25:05.898 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:25:06 compute-0 nova_compute[192903]: 2025-10-06 14:25:06.441 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:25:06 compute-0 nova_compute[192903]: 2025-10-06 14:25:06.441 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.723s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.827 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "40adb852-2234-400a-bacc-bbe8ecad4a52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.828 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.828 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.829 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.829 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:07 compute-0 nova_compute[192903]: 2025-10-06 14:25:07.847 2 INFO nova.compute.manager [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Terminating instance
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.363 2 DEBUG nova.compute.manager [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:25:08 compute-0 kernel: tap7cc4c824-71 (unregistering): left promiscuous mode
Oct 06 14:25:08 compute-0 NetworkManager[52035]: <info>  [1759760708.3924] device (tap7cc4c824-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 ovn_controller[95205]: 2025-10-06T14:25:08Z|00230|binding|INFO|Releasing lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b from this chassis (sb_readonly=0)
Oct 06 14:25:08 compute-0 ovn_controller[95205]: 2025-10-06T14:25:08Z|00231|binding|INFO|Setting lport 7cc4c824-7161-48a1-88e5-a3cc6e77170b down in Southbound
Oct 06 14:25:08 compute-0 ovn_controller[95205]: 2025-10-06T14:25:08Z|00232|binding|INFO|Removing iface tap7cc4c824-71 ovn-installed in OVS
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.414 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:d1:d3 10.100.0.6'], port_security=['fa:16:3e:1e:d1:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '40adb852-2234-400a-bacc-bbe8ecad4a52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=7cc4c824-7161-48a1-88e5-a3cc6e77170b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.415 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 7cc4c824-7161-48a1-88e5-a3cc6e77170b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.417 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.440 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b07ac759-d2c1-453d-8739-884a8dcabe90]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct 06 14:25:08 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Consumed 2.764s CPU time.
Oct 06 14:25:08 compute-0 systemd-machined[152985]: Machine qemu-20-instance-00000019 terminated.
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.484 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[30faa0fd-7a91-4227-b890-7dae125eefd6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.489 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[ea693b5b-f5db-4c37-a626-8f63ce599fe9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.531 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7527355b-98e5-4731-978c-7f6518e72784]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.550 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[73e9f6e6-2045-44c8-ab7a-bf91063d7ff7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 47, 'tx_packets': 7, 'rx_bytes': 2470, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 47, 'tx_packets': 7, 'rx_bytes': 2470, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512361, 'reachable_time': 28259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226258, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.572 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3719a409-0d0b-4b6f-8740-750097c798ea]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512380, 'tstamp': 512380}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226259, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512384, 'tstamp': 512384}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226259, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.574 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.582 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.582 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.583 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.583 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:25:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:08.586 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ede6e6-7eec-4270-a1c6-0b96d8a197eb]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.659 2 INFO nova.virt.libvirt.driver [-] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Instance destroyed successfully.
Oct 06 14:25:08 compute-0 nova_compute[192903]: 2025-10-06 14:25:08.659 2 DEBUG nova.objects.instance [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid 40adb852-2234-400a-bacc-bbe8ecad4a52 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.170 2 DEBUG nova.virt.libvirt.vif [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:23:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1320006994',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1320006994',id=25,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:23:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-w6qigexn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:25:04Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=40adb852-2234-400a-bacc-bbe8ecad4a52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.171 2 DEBUG nova.network.os_vif_util [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "address": "fa:16:3e:1e:d1:d3", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7cc4c824-71", "ovs_interfaceid": "7cc4c824-7161-48a1-88e5-a3cc6e77170b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.172 2 DEBUG nova.network.os_vif_util [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.172 2 DEBUG os_vif [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.176 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cc4c824-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.182 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5e65d6a9-b5ca-457e-9e4e-1b2792430aa3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.187 2 INFO os_vif [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:d1:d3,bridge_name='br-int',has_traffic_filtering=True,id=7cc4c824-7161-48a1-88e5-a3cc6e77170b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7cc4c824-71')
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.188 2 INFO nova.virt.libvirt.driver [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Deleting instance files /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52_del
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.189 2 INFO nova.virt.libvirt.driver [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Deletion of /var/lib/nova/instances/40adb852-2234-400a-bacc-bbe8ecad4a52_del complete
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.437 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.438 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.702 2 INFO nova.compute.manager [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.702 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.703 2 DEBUG nova.compute.manager [-] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.703 2 DEBUG nova.network.neutron [-] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.704 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.948 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.949 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:09 compute-0 nova_compute[192903]: 2025-10-06 14:25:09.949 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.454 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.559 2 DEBUG nova.compute.manager [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Received event network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.559 2 DEBUG oslo_concurrency.lockutils [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.559 2 DEBUG oslo_concurrency.lockutils [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.559 2 DEBUG oslo_concurrency.lockutils [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.560 2 DEBUG nova.compute.manager [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] No waiting events found dispatching network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:25:10 compute-0 nova_compute[192903]: 2025-10-06 14:25:10.560 2 DEBUG nova.compute.manager [req-b761609b-21e2-49bf-b18e-59a47d2f47e8 req-574682c5-256e-411f-a6a1-985498b69478 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Received event network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:25:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:11.396 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:11.396 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:11.397 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:11 compute-0 nova_compute[192903]: 2025-10-06 14:25:11.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.231 2 DEBUG nova.network.neutron [-] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.741 2 INFO nova.compute.manager [-] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Took 3.04 seconds to deallocate network for instance.
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.764 2 DEBUG nova.compute.manager [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Received event network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.765 2 DEBUG oslo_concurrency.lockutils [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.765 2 DEBUG oslo_concurrency.lockutils [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.766 2 DEBUG oslo_concurrency.lockutils [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.766 2 DEBUG nova.compute.manager [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] No waiting events found dispatching network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.767 2 DEBUG nova.compute.manager [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Received event network-vif-unplugged-7cc4c824-7161-48a1-88e5-a3cc6e77170b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:25:12 compute-0 nova_compute[192903]: 2025-10-06 14:25:12.767 2 DEBUG nova.compute.manager [req-90943dc2-5f03-44ae-a50a-de0f32cb107f req-a7004dd3-25c8-4ecc-8e86-d2fe8f88e3a0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 40adb852-2234-400a-bacc-bbe8ecad4a52] Received event network-vif-deleted-7cc4c824-7161-48a1-88e5-a3cc6e77170b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:13 compute-0 nova_compute[192903]: 2025-10-06 14:25:13.280 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:13 compute-0 nova_compute[192903]: 2025-10-06 14:25:13.281 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:13 compute-0 nova_compute[192903]: 2025-10-06 14:25:13.286 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:13 compute-0 nova_compute[192903]: 2025-10-06 14:25:13.331 2 INFO nova.scheduler.client.report [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance 40adb852-2234-400a-bacc-bbe8ecad4a52
Oct 06 14:25:14 compute-0 nova_compute[192903]: 2025-10-06 14:25:14.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:14 compute-0 nova_compute[192903]: 2025-10-06 14:25:14.359 2 DEBUG oslo_concurrency.lockutils [None req-4196cb3d-47de-46bf-a394-1e8cdc656d92 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "40adb852-2234-400a-bacc-bbe8ecad4a52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.532s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:14 compute-0 nova_compute[192903]: 2025-10-06 14:25:14.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:14 compute-0 nova_compute[192903]: 2025-10-06 14:25:14.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.102 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "91e3d012-6b97-466e-a069-c3e4424d2f67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.102 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.103 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.103 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.104 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.121 2 INFO nova.compute.manager [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Terminating instance
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.637 2 DEBUG nova.compute.manager [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:25:15 compute-0 kernel: tap65f56f13-1c (unregistering): left promiscuous mode
Oct 06 14:25:15 compute-0 NetworkManager[52035]: <info>  [1759760715.6627] device (tap65f56f13-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:15 compute-0 ovn_controller[95205]: 2025-10-06T14:25:15Z|00233|binding|INFO|Releasing lport 65f56f13-1ccd-4101-aa20-05f1634ca2df from this chassis (sb_readonly=0)
Oct 06 14:25:15 compute-0 ovn_controller[95205]: 2025-10-06T14:25:15Z|00234|binding|INFO|Setting lport 65f56f13-1ccd-4101-aa20-05f1634ca2df down in Southbound
Oct 06 14:25:15 compute-0 ovn_controller[95205]: 2025-10-06T14:25:15Z|00235|binding|INFO|Removing iface tap65f56f13-1c ovn-installed in OVS
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.684 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:46:b4 10.100.0.11'], port_security=['fa:16:3e:26:46:b4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '91e3d012-6b97-466e-a069-c3e4424d2f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=65f56f13-1ccd-4101-aa20-05f1634ca2df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.685 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 65f56f13-1ccd-4101-aa20-05f1634ca2df in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.686 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.688 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5edc829a-3c96-408b-a163-9db10b6fe1fb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.688 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:15 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct 06 14:25:15 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Consumed 4.120s CPU time.
Oct 06 14:25:15 compute-0 systemd-machined[152985]: Machine qemu-19-instance-00000018 terminated.
Oct 06 14:25:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [NOTICE]   (225968) : haproxy version is 3.0.5-8e879a5
Oct 06 14:25:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [NOTICE]   (225968) : path to executable is /usr/sbin/haproxy
Oct 06 14:25:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [WARNING]  (225968) : Exiting Master process...
Oct 06 14:25:15 compute-0 podman[226304]: 2025-10-06 14:25:15.838079183 +0000 UTC m=+0.042082956 container kill e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 14:25:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [ALERT]    (225968) : Current worker (225970) exited with code 143 (Terminated)
Oct 06 14:25:15 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[225964]: [WARNING]  (225968) : All workers exited. Exiting... (0)
Oct 06 14:25:15 compute-0 systemd[1]: libpod-e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a.scope: Deactivated successfully.
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.852 2 DEBUG nova.compute.manager [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Received event network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.853 2 DEBUG oslo_concurrency.lockutils [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.854 2 DEBUG oslo_concurrency.lockutils [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.854 2 DEBUG oslo_concurrency.lockutils [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.855 2 DEBUG nova.compute.manager [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] No waiting events found dispatching network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.855 2 DEBUG nova.compute.manager [req-1360c237-f7f0-4029-bd4f-bd592133d077 req-b6524b1b-e0a7-4e01-ab01-59760b954ecd e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Received event network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:25:15 compute-0 podman[226319]: 2025-10-06 14:25:15.897999366 +0000 UTC m=+0.038409149 container died e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.909 2 INFO nova.virt.libvirt.driver [-] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Instance destroyed successfully.
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.910 2 DEBUG nova.objects.instance [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid 91e3d012-6b97-466e-a069-c3e4424d2f67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:25:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a-userdata-shm.mount: Deactivated successfully.
Oct 06 14:25:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6961321e90d65a514ae7775959adb6aed18a18864b37d4f5ec18559c94f164e-merged.mount: Deactivated successfully.
Oct 06 14:25:15 compute-0 podman[226319]: 2025-10-06 14:25:15.9518486 +0000 UTC m=+0.092258393 container cleanup e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:25:15 compute-0 systemd[1]: libpod-conmon-e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a.scope: Deactivated successfully.
Oct 06 14:25:15 compute-0 podman[226326]: 2025-10-06 14:25:15.97429082 +0000 UTC m=+0.096866435 container remove e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.982 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f5d2ac-5026-4b9d-9cef-c0d74a5154c7]: (4, ("Mon Oct  6 02:25:15 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a)\ne8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a\nMon Oct  6 02:25:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (e8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a)\ne8c31fd5d1b79bc6fd9acd542447c665b2cdf33fa056598e475974e17d4b097a\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.984 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[607ed25e-9c13-41b1-aba9-b78cca2accaf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.984 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.985 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3644aa32-2bec-4e44-bed3-1118aec69f39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:15.985 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:15 compute-0 nova_compute[192903]: 2025-10-06 14:25:15.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:15 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.017 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6114bf79-eeb5-4f34-9161-2abedab72873]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.043 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[24ae1b71-4cba-490d-85eb-ada82f64ef75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.044 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[41f1f731-c75d-4b0e-b20d-41247456c1f3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.062 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3ff5e8-b4e4-44c0-b888-5e2c465097d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512352, 'reachable_time': 17490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226371, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.065 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:25:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:16.065 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[c243eb91-a46c-41dd-b03a-8152f977c1bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:25:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.417 2 DEBUG nova.virt.libvirt.vif [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1149461266',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1149461266',id=24,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:23:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-c006e5sm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:24:34Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=91e3d012-6b97-466e-a069-c3e4424d2f67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.418 2 DEBUG nova.network.os_vif_util [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "address": "fa:16:3e:26:46:b4", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f56f13-1c", "ovs_interfaceid": "65f56f13-1ccd-4101-aa20-05f1634ca2df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.420 2 DEBUG nova.network.os_vif_util [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.421 2 DEBUG os_vif [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.427 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65f56f13-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6ed53db1-dfbf-42a9-834f-7c167051bebd) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.439 2 INFO os_vif [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:46:b4,bridge_name='br-int',has_traffic_filtering=True,id=65f56f13-1ccd-4101-aa20-05f1634ca2df,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f56f13-1c')
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.440 2 INFO nova.virt.libvirt.driver [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Deleting instance files /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67_del
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.441 2 INFO nova.virt.libvirt.driver [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Deletion of /var/lib/nova/instances/91e3d012-6b97-466e-a069-c3e4424d2f67_del complete
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.954 2 INFO nova.compute.manager [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.955 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.956 2 DEBUG nova.compute.manager [-] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.956 2 DEBUG nova.network.neutron [-] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:25:16 compute-0 nova_compute[192903]: 2025-10-06 14:25:16.957 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.382 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.923 2 DEBUG nova.compute.manager [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Received event network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.924 2 DEBUG oslo_concurrency.lockutils [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.924 2 DEBUG oslo_concurrency.lockutils [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.925 2 DEBUG oslo_concurrency.lockutils [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.926 2 DEBUG nova.compute.manager [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] No waiting events found dispatching network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.926 2 DEBUG nova.compute.manager [req-aec1d0bb-363a-4155-a661-a05c4222bd29 req-74d46a86-08c8-438f-bc38-7281cd64a3f8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Received event network-vif-unplugged-65f56f13-1ccd-4101-aa20-05f1634ca2df for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.966 2 DEBUG nova.compute.manager [req-8d9cdadf-6149-436f-ab3f-e2a8fe8cdf05 req-21ff48a1-d54f-45d3-bc23-83e1f0d4491a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Received event network-vif-deleted-65f56f13-1ccd-4101-aa20-05f1634ca2df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.967 2 INFO nova.compute.manager [req-8d9cdadf-6149-436f-ab3f-e2a8fe8cdf05 req-21ff48a1-d54f-45d3-bc23-83e1f0d4491a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Neutron deleted interface 65f56f13-1ccd-4101-aa20-05f1634ca2df; detaching it from the instance and deleting it from the info cache
Oct 06 14:25:17 compute-0 nova_compute[192903]: 2025-10-06 14:25:17.968 2 DEBUG nova.network.neutron [req-8d9cdadf-6149-436f-ab3f-e2a8fe8cdf05 req-21ff48a1-d54f-45d3-bc23-83e1f0d4491a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:25:18 compute-0 nova_compute[192903]: 2025-10-06 14:25:18.391 2 DEBUG nova.network.neutron [-] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:25:18 compute-0 nova_compute[192903]: 2025-10-06 14:25:18.477 2 DEBUG nova.compute.manager [req-8d9cdadf-6149-436f-ab3f-e2a8fe8cdf05 req-21ff48a1-d54f-45d3-bc23-83e1f0d4491a e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Detach interface failed, port_id=65f56f13-1ccd-4101-aa20-05f1634ca2df, reason: Instance 91e3d012-6b97-466e-a069-c3e4424d2f67 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:25:18 compute-0 nova_compute[192903]: 2025-10-06 14:25:18.898 2 INFO nova.compute.manager [-] [instance: 91e3d012-6b97-466e-a069-c3e4424d2f67] Took 1.94 seconds to deallocate network for instance.
Oct 06 14:25:19 compute-0 nova_compute[192903]: 2025-10-06 14:25:19.422 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:25:19 compute-0 nova_compute[192903]: 2025-10-06 14:25:19.425 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:25:19 compute-0 nova_compute[192903]: 2025-10-06 14:25:19.481 2 DEBUG nova.compute.provider_tree [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:25:19 compute-0 nova_compute[192903]: 2025-10-06 14:25:19.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:19 compute-0 nova_compute[192903]: 2025-10-06 14:25:19.989 2 DEBUG nova.scheduler.client.report [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:25:20 compute-0 nova_compute[192903]: 2025-10-06 14:25:20.502 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.077s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:20 compute-0 nova_compute[192903]: 2025-10-06 14:25:20.521 2 INFO nova.scheduler.client.report [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance 91e3d012-6b97-466e-a069-c3e4424d2f67
Oct 06 14:25:21 compute-0 podman[226374]: 2025-10-06 14:25:21.23669537 +0000 UTC m=+0.083226317 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:25:21 compute-0 podman[226375]: 2025-10-06 14:25:21.246034895 +0000 UTC m=+0.089737628 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:25:21 compute-0 podman[226373]: 2025-10-06 14:25:21.248485509 +0000 UTC m=+0.099118684 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 06 14:25:21 compute-0 podman[226372]: 2025-10-06 14:25:21.281298421 +0000 UTC m=+0.133581059 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:25:21 compute-0 nova_compute[192903]: 2025-10-06 14:25:21.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:21 compute-0 nova_compute[192903]: 2025-10-06 14:25:21.554 2 DEBUG oslo_concurrency.lockutils [None req-5db7b42b-6de8-4b52-a237-da96ccf46987 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "91e3d012-6b97-466e-a069-c3e4424d2f67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.451s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:25:24 compute-0 nova_compute[192903]: 2025-10-06 14:25:24.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:26 compute-0 nova_compute[192903]: 2025-10-06 14:25:26.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:29 compute-0 nova_compute[192903]: 2025-10-06 14:25:29.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:29 compute-0 podman[203308]: time="2025-10-06T14:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:25:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:25:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: ERROR   14:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: ERROR   14:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: ERROR   14:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: ERROR   14:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: ERROR   14:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:25:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:25:31 compute-0 nova_compute[192903]: 2025-10-06 14:25:31.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:33 compute-0 podman[226460]: 2025-10-06 14:25:33.21311577 +0000 UTC m=+0.079734765 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:25:34 compute-0 nova_compute[192903]: 2025-10-06 14:25:34.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:36 compute-0 podman[226480]: 2025-10-06 14:25:36.22646603 +0000 UTC m=+0.083390100 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm)
Oct 06 14:25:36 compute-0 nova_compute[192903]: 2025-10-06 14:25:36.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:39 compute-0 nova_compute[192903]: 2025-10-06 14:25:39.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:41 compute-0 nova_compute[192903]: 2025-10-06 14:25:41.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:44 compute-0 nova_compute[192903]: 2025-10-06 14:25:44.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:46 compute-0 nova_compute[192903]: 2025-10-06 14:25:46.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:49 compute-0 nova_compute[192903]: 2025-10-06 14:25:49.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:51 compute-0 nova_compute[192903]: 2025-10-06 14:25:51.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:52 compute-0 podman[226503]: 2025-10-06 14:25:52.191361727 +0000 UTC m=+0.050708293 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 14:25:52 compute-0 podman[226504]: 2025-10-06 14:25:52.209148554 +0000 UTC m=+0.057953203 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 06 14:25:52 compute-0 podman[226502]: 2025-10-06 14:25:52.215918212 +0000 UTC m=+0.079592891 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 14:25:52 compute-0 podman[226512]: 2025-10-06 14:25:52.229873368 +0000 UTC m=+0.078824551 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:25:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:54.386 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:25:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:54.387 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:25:54 compute-0 nova_compute[192903]: 2025-10-06 14:25:54.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:54 compute-0 nova_compute[192903]: 2025-10-06 14:25:54.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:25:56.389 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:25:56 compute-0 nova_compute[192903]: 2025-10-06 14:25:56.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:59 compute-0 nova_compute[192903]: 2025-10-06 14:25:59.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:25:59 compute-0 podman[203308]: time="2025-10-06T14:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:25:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:25:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 06 14:26:00 compute-0 nova_compute[192903]: 2025-10-06 14:26:00.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: ERROR   14:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: ERROR   14:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: ERROR   14:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: ERROR   14:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: ERROR   14:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:26:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:26:01 compute-0 nova_compute[192903]: 2025-10-06 14:26:01.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:01 compute-0 nova_compute[192903]: 2025-10-06 14:26:01.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.098 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.273 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.274 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.296 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.297 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.30014038085938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.297 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:26:02 compute-0 nova_compute[192903]: 2025-10-06 14:26:02.297 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:26:03 compute-0 nova_compute[192903]: 2025-10-06 14:26:03.345 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:26:03 compute-0 nova_compute[192903]: 2025-10-06 14:26:03.346 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:26:02 up  1:27,  0 user,  load average: 0.07, 0.19, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:26:03 compute-0 nova_compute[192903]: 2025-10-06 14:26:03.425 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:26:03 compute-0 nova_compute[192903]: 2025-10-06 14:26:03.932 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:26:04 compute-0 podman[226589]: 2025-10-06 14:26:04.0266144 +0000 UTC m=+0.070627579 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid)
Oct 06 14:26:04 compute-0 nova_compute[192903]: 2025-10-06 14:26:04.441 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:26:04 compute-0 nova_compute[192903]: 2025-10-06 14:26:04.441 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:26:04 compute-0 nova_compute[192903]: 2025-10-06 14:26:04.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:06 compute-0 nova_compute[192903]: 2025-10-06 14:26:06.438 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:06 compute-0 nova_compute[192903]: 2025-10-06 14:26:06.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:07 compute-0 podman[226607]: 2025-10-06 14:26:07.191010954 +0000 UTC m=+0.051199149 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:26:07 compute-0 nova_compute[192903]: 2025-10-06 14:26:07.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:07 compute-0 nova_compute[192903]: 2025-10-06 14:26:07.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:07 compute-0 nova_compute[192903]: 2025-10-06 14:26:07.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:26:09 compute-0 nova_compute[192903]: 2025-10-06 14:26:09.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:11.399 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:26:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:11.399 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:26:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:11.399 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:26:11 compute-0 nova_compute[192903]: 2025-10-06 14:26:11.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:12 compute-0 nova_compute[192903]: 2025-10-06 14:26:12.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:14 compute-0 nova_compute[192903]: 2025-10-06 14:26:14.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:15 compute-0 nova_compute[192903]: 2025-10-06 14:26:15.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:16 compute-0 nova_compute[192903]: 2025-10-06 14:26:16.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:17 compute-0 nova_compute[192903]: 2025-10-06 14:26:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:26:19 compute-0 nova_compute[192903]: 2025-10-06 14:26:19.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:21 compute-0 nova_compute[192903]: 2025-10-06 14:26:21.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:23 compute-0 podman[226631]: 2025-10-06 14:26:23.197897957 +0000 UTC m=+0.055657450 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 14:26:23 compute-0 podman[226630]: 2025-10-06 14:26:23.203387847 +0000 UTC m=+0.066038144 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:26:23 compute-0 podman[226632]: 2025-10-06 14:26:23.230096746 +0000 UTC m=+0.079457450 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:26:23 compute-0 podman[226629]: 2025-10-06 14:26:23.246661078 +0000 UTC m=+0.106553020 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=watcher_latest)
Oct 06 14:26:24 compute-0 nova_compute[192903]: 2025-10-06 14:26:24.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:26 compute-0 nova_compute[192903]: 2025-10-06 14:26:26.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:29 compute-0 nova_compute[192903]: 2025-10-06 14:26:29.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:29 compute-0 podman[203308]: time="2025-10-06T14:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:26:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:26:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: ERROR   14:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: ERROR   14:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: ERROR   14:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: ERROR   14:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: ERROR   14:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:26:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:26:31 compute-0 nova_compute[192903]: 2025-10-06 14:26:31.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:34 compute-0 podman[226718]: 2025-10-06 14:26:34.231734636 +0000 UTC m=+0.083141781 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:26:34 compute-0 ovn_controller[95205]: 2025-10-06T14:26:34Z|00236|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:26:34 compute-0 nova_compute[192903]: 2025-10-06 14:26:34.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.053 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Creating tmpfile /var/lib/nova/instances/tmpkltwqzq_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.054 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.059 2 DEBUG nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkltwqzq_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.233 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Creating tmpfile /var/lib/nova/instances/tmps1o3bez_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.234 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.237 2 DEBUG nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps1o3bez_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:26:36 compute-0 nova_compute[192903]: 2025-10-06 14:26:36.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:38 compute-0 nova_compute[192903]: 2025-10-06 14:26:38.101 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:38 compute-0 podman[226738]: 2025-10-06 14:26:38.242173202 +0000 UTC m=+0.098087579 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350)
Oct 06 14:26:38 compute-0 nova_compute[192903]: 2025-10-06 14:26:38.266 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:39 compute-0 nova_compute[192903]: 2025-10-06 14:26:39.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:41 compute-0 nova_compute[192903]: 2025-10-06 14:26:41.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:42 compute-0 nova_compute[192903]: 2025-10-06 14:26:42.310 2 DEBUG nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps1o3bez_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b5e5a9ae-529f-46a2-9288-091ff73bf07f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:26:43 compute-0 nova_compute[192903]: 2025-10-06 14:26:43.323 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:26:43 compute-0 nova_compute[192903]: 2025-10-06 14:26:43.324 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:26:43 compute-0 nova_compute[192903]: 2025-10-06 14:26:43.324 2 DEBUG nova.network.neutron [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:26:43 compute-0 nova_compute[192903]: 2025-10-06 14:26:43.831 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:44 compute-0 nova_compute[192903]: 2025-10-06 14:26:44.558 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:44 compute-0 nova_compute[192903]: 2025-10-06 14:26:44.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:44 compute-0 nova_compute[192903]: 2025-10-06 14:26:44.736 2 DEBUG nova.network.neutron [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Updating instance_info_cache with network_info: [{"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.243 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.260 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps1o3bez_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b5e5a9ae-529f-46a2-9288-091ff73bf07f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.260 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Creating instance directory: /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.261 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Creating disk.info with the contents: {'/var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk': 'qcow2', '/var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.261 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.262 2 DEBUG nova.objects.instance [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid b5e5a9ae-529f-46a2-9288-091ff73bf07f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.768 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.776 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.778 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.869 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.870 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.871 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.872 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.878 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.879 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.946 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.947 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.984 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.985 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:26:45 compute-0 nova_compute[192903]: 2025-10-06 14:26:45.986 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.040 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.043 2 DEBUG nova.virt.disk.api [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.044 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.105 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.106 2 DEBUG nova.virt.disk.api [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.111 2 DEBUG nova.objects.instance [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid b5e5a9ae-529f-46a2-9288-091ff73bf07f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.621 2 DEBUG nova.objects.base [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<b5e5a9ae-529f-46a2-9288-091ff73bf07f> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.622 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.655 2 DEBUG oslo_concurrency.processutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk.config 497664" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.657 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.659 2 DEBUG nova.virt.libvirt.vif [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:25:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-310983494',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-310983494',id=27,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:26:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-hk300dlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:26:06Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=b5e5a9ae-529f-46a2-9288-091ff73bf07f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.660 2 DEBUG nova.network.os_vif_util [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.661 2 DEBUG nova.network.os_vif_util [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.662 2 DEBUG os_vif [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.665 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.667 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a0f393cb-0d1e-5110-8685-e130a3ab595b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.687 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d929184-72, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.687 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3d929184-72, col_values=(('qos', UUID('21c92a1d-7fd9-4af2-adff-815f428c135f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.688 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3d929184-72, col_values=(('external_ids', {'iface-id': '3d929184-7221-43be-9263-037768dac50b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:5e:28', 'vm-uuid': 'b5e5a9ae-529f-46a2-9288-091ff73bf07f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 NetworkManager[52035]: <info>  [1759760806.6912] manager: (tap3d929184-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.697 2 INFO os_vif [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72')
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.697 2 DEBUG nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.697 2 DEBUG nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps1o3bez_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b5e5a9ae-529f-46a2-9288-091ff73bf07f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.698 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:46 compute-0 nova_compute[192903]: 2025-10-06 14:26:46.760 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:47 compute-0 nova_compute[192903]: 2025-10-06 14:26:47.755 2 DEBUG nova.network.neutron [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Port 3d929184-7221-43be-9263-037768dac50b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:26:47 compute-0 nova_compute[192903]: 2025-10-06 14:26:47.771 2 DEBUG nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps1o3bez_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b5e5a9ae-529f-46a2-9288-091ff73bf07f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:26:49 compute-0 nova_compute[192903]: 2025-10-06 14:26:49.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 kernel: tap3d929184-72: entered promiscuous mode
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.3401] manager: (tap3d929184-72): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Oct 06 14:26:51 compute-0 systemd-udevd[226792]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:26:51 compute-0 ovn_controller[95205]: 2025-10-06T14:26:51Z|00237|binding|INFO|Claiming lport 3d929184-7221-43be-9263-037768dac50b for this additional chassis.
Oct 06 14:26:51 compute-0 ovn_controller[95205]: 2025-10-06T14:26:51Z|00238|binding|INFO|3d929184-7221-43be-9263-037768dac50b: Claiming fa:16:3e:a7:5e:28 10.100.0.11
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.406 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:5e:28 10.100.0.11'], port_security=['fa:16:3e:a7:5e:28 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b5e5a9ae-529f-46a2-9288-091ff73bf07f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=3d929184-7221-43be-9263-037768dac50b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.408 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 3d929184-7221-43be-9263-037768dac50b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.410 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:26:51 compute-0 ovn_controller[95205]: 2025-10-06T14:26:51Z|00239|binding|INFO|Setting lport 3d929184-7221-43be-9263-037768dac50b ovn-installed in OVS
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.4197] device (tap3d929184-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.4206] device (tap3d929184-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.432 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c9ac18-3ecb-4665-a2f1-e3917465596a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.433 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55ccf1b2-d1 in ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:26:51 compute-0 systemd-machined[152985]: New machine qemu-21-instance-0000001b.
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.436 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55ccf1b2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.436 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1532c6-2c64-43fb-a7df-1222fc1d21db]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.438 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[99ff553d-b622-4cfd-8521-dad270503522]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001b.
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.455 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[3568ae33-d0ec-4b32-8e57-986b4babe668]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.474 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bd530c68-3069-4864-bab2-72d711455a4b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.521 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[d63b748a-5201-4f9d-ac5d-d8966c7e9455]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 14:26:51 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.530 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4479e6-3381-4f96-9761-bac90419164f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.5318] manager: (tap55ccf1b2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.584 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[71480c77-cac9-4c03-833b-91a8d348f9ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.587 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc131ee-f488-4282-8cbd-801f7b6cd724]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.6188] device (tap55ccf1b2-d0): carrier: link connected
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.627 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e35e57-78a7-4c8b-96b8-52319dba34f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.653 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8075177f-59c4-4ab8-900a-5be7f63c9a21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527326, 'reachable_time': 42908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226829, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.676 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3989dbfd-a4cf-4663-a182-de7bff057b2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:aab9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527326, 'tstamp': 527326}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226830, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.701 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c59317f7-8c70-444d-935c-633609598e54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527326, 'reachable_time': 42908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226831, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.741 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5dada2c2-4cfe-45ec-a1ad-d2b93af7ae85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.825 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2300d2f4-2f1c-434d-b43b-d9777ebd06df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.827 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.827 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.829 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 NetworkManager[52035]: <info>  [1759760811.8316] manager: (tap55ccf1b2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct 06 14:26:51 compute-0 kernel: tap55ccf1b2-d0: entered promiscuous mode
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.833 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 ovn_controller[95205]: 2025-10-06T14:26:51Z|00240|binding|INFO|Releasing lport 0ee47753-a40c-4a21-a6ed-65093b6727d9 from this chassis (sb_readonly=0)
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.849 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ad962078-ced2-4d7d-9775-e65e2b57f898]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.850 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.850 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.850 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 55ccf1b2-d24e-4063-b15b-60a65227d75e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.850 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:26:51 compute-0 nova_compute[192903]: 2025-10-06 14:26:51.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.851 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6a496635-6439-496d-91bb-401c3912be24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.852 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.852 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[db4e4e78-35cd-45d3-a362-590aba3cfaf8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.853 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:26:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:51.853 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'env', 'PROCESS_TAG=haproxy-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55ccf1b2-d24e-4063-b15b-60a65227d75e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:26:52 compute-0 podman[226869]: 2025-10-06 14:26:52.279069811 +0000 UTC m=+0.051685412 container create 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 06 14:26:52 compute-0 systemd[1]: Started libpod-conmon-48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0.scope.
Oct 06 14:26:52 compute-0 podman[226869]: 2025-10-06 14:26:52.250228604 +0000 UTC m=+0.022844235 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:26:52 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:26:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af307c3a8131306c14b1cd564ad8c750d39b9ac837c87be17f4702cd0e877049/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:26:52 compute-0 podman[226869]: 2025-10-06 14:26:52.37569954 +0000 UTC m=+0.148315231 container init 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true)
Oct 06 14:26:52 compute-0 podman[226869]: 2025-10-06 14:26:52.382112565 +0000 UTC m=+0.154728206 container start 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:26:52 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [NOTICE]   (226888) : New worker (226890) forked
Oct 06 14:26:52 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [NOTICE]   (226888) : Loading success.
Oct 06 14:26:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:54.141 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:26:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:54.142 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:26:54 compute-0 nova_compute[192903]: 2025-10-06 14:26:54.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:54 compute-0 ovn_controller[95205]: 2025-10-06T14:26:54Z|00241|binding|INFO|Claiming lport 3d929184-7221-43be-9263-037768dac50b for this chassis.
Oct 06 14:26:54 compute-0 ovn_controller[95205]: 2025-10-06T14:26:54Z|00242|binding|INFO|3d929184-7221-43be-9263-037768dac50b: Claiming fa:16:3e:a7:5e:28 10.100.0.11
Oct 06 14:26:54 compute-0 ovn_controller[95205]: 2025-10-06T14:26:54Z|00243|binding|INFO|Setting lport 3d929184-7221-43be-9263-037768dac50b up in Southbound
Oct 06 14:26:54 compute-0 podman[226917]: 2025-10-06 14:26:54.252041654 +0000 UTC m=+0.090835061 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:26:54 compute-0 podman[226916]: 2025-10-06 14:26:54.264382071 +0000 UTC m=+0.105485781 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Oct 06 14:26:54 compute-0 podman[226915]: 2025-10-06 14:26:54.271916836 +0000 UTC m=+0.118153407 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:26:54 compute-0 podman[226914]: 2025-10-06 14:26:54.285848687 +0000 UTC m=+0.133041144 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct 06 14:26:54 compute-0 nova_compute[192903]: 2025-10-06 14:26:54.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.253 2 INFO nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Post operation of migration started
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.254 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.336 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.338 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.495 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.495 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:26:55 compute-0 nova_compute[192903]: 2025-10-06 14:26:55.496 2 DEBUG nova.network.neutron [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:26:56 compute-0 nova_compute[192903]: 2025-10-06 14:26:56.002 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:56 compute-0 nova_compute[192903]: 2025-10-06 14:26:56.392 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:26:56 compute-0 nova_compute[192903]: 2025-10-06 14:26:56.633 2 DEBUG nova.network.neutron [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Updating instance_info_cache with network_info: [{"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:26:56 compute-0 nova_compute[192903]: 2025-10-06 14:26:56.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:57 compute-0 nova_compute[192903]: 2025-10-06 14:26:57.142 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-b5e5a9ae-529f-46a2-9288-091ff73bf07f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:26:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:26:57.145 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:26:57 compute-0 nova_compute[192903]: 2025-10-06 14:26:57.665 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:26:57 compute-0 nova_compute[192903]: 2025-10-06 14:26:57.667 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:26:57 compute-0 nova_compute[192903]: 2025-10-06 14:26:57.667 2 DEBUG oslo_concurrency.lockutils [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:26:57 compute-0 nova_compute[192903]: 2025-10-06 14:26:57.674 2 INFO nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:26:57 compute-0 virtqemud[192802]: Domain id=21 name='instance-0000001b' uuid=b5e5a9ae-529f-46a2-9288-091ff73bf07f is tainted: custom-monitor
Oct 06 14:26:58 compute-0 nova_compute[192903]: 2025-10-06 14:26:58.683 2 INFO nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:26:59 compute-0 nova_compute[192903]: 2025-10-06 14:26:59.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:26:59 compute-0 nova_compute[192903]: 2025-10-06 14:26:59.691 2 INFO nova.virt.libvirt.driver [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:26:59 compute-0 nova_compute[192903]: 2025-10-06 14:26:59.696 2 DEBUG nova.compute.manager [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:26:59 compute-0 podman[203308]: time="2025-10-06T14:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:26:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:26:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Oct 06 14:27:00 compute-0 nova_compute[192903]: 2025-10-06 14:27:00.208 2 DEBUG nova.objects.instance [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:27:00 compute-0 nova_compute[192903]: 2025-10-06 14:27:00.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:01 compute-0 nova_compute[192903]: 2025-10-06 14:27:01.227 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:01 compute-0 openstack_network_exporter[205500]: ERROR   14:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:27:01 compute-0 openstack_network_exporter[205500]: ERROR   14:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:27:01 compute-0 openstack_network_exporter[205500]: ERROR   14:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:27:01 compute-0 openstack_network_exporter[205500]: ERROR   14:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:27:01 compute-0 openstack_network_exporter[205500]: ERROR   14:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:27:01 compute-0 nova_compute[192903]: 2025-10-06 14:27:01.551 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:01 compute-0 nova_compute[192903]: 2025-10-06 14:27:01.551 2 WARNING neutronclient.v2_0.client [None req-aeaad016-0c78-4c1a-b8b0-e0c47857af51 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:01 compute-0 nova_compute[192903]: 2025-10-06 14:27:01.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:02 compute-0 nova_compute[192903]: 2025-10-06 14:27:02.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:03 compute-0 nova_compute[192903]: 2025-10-06 14:27:03.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:03 compute-0 nova_compute[192903]: 2025-10-06 14:27:03.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:03 compute-0 nova_compute[192903]: 2025-10-06 14:27:03.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:03 compute-0 nova_compute[192903]: 2025-10-06 14:27:03.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.156 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.258 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.259 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.331 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.487 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.488 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.507 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.508 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.2712631225586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.509 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.509 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:04 compute-0 nova_compute[192903]: 2025-10-06 14:27:04.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:05 compute-0 podman[227011]: 2025-10-06 14:27:05.208469792 +0000 UTC m=+0.068416089 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:27:05 compute-0 nova_compute[192903]: 2025-10-06 14:27:05.528 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance b5e5a9ae-529f-46a2-9288-091ff73bf07f as it has an incoming, in-progress migration def21838-dfd3-4c87-8723-1038cc06383d. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:27:05 compute-0 nova_compute[192903]: 2025-10-06 14:27:05.529 2 DEBUG nova.objects.instance [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:27:06 compute-0 nova_compute[192903]: 2025-10-06 14:27:06.036 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration for instance f1e54903-0242-47cc-9a49-a10112fb0f51 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:27:06 compute-0 nova_compute[192903]: 2025-10-06 14:27:06.037 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:27:06 compute-0 nova_compute[192903]: 2025-10-06 14:27:06.545 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Updating resource usage from migration aa31ab55-66ba-4c1e-ae5c-fb9ca690ae1d
Oct 06 14:27:06 compute-0 nova_compute[192903]: 2025-10-06 14:27:06.546 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Starting to track incoming migration aa31ab55-66ba-4c1e-ae5c-fb9ca690ae1d with flavor 8cb06c85-e9e7-417f-906b-1f7cf29f7de9 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 06 14:27:06 compute-0 nova_compute[192903]: 2025-10-06 14:27:06.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:07 compute-0 nova_compute[192903]: 2025-10-06 14:27:07.093 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance b5e5a9ae-529f-46a2-9288-091ff73bf07f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:27:07 compute-0 nova_compute[192903]: 2025-10-06 14:27:07.600 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance f1e54903-0242-47cc-9a49-a10112fb0f51 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 06 14:27:07 compute-0 nova_compute[192903]: 2025-10-06 14:27:07.601 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:27:07 compute-0 nova_compute[192903]: 2025-10-06 14:27:07.602 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:27:04 up  1:28,  0 user,  load average: 0.28, 0.22, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8f3f3b7d20fc4715811486da569fc0ab': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:27:07 compute-0 nova_compute[192903]: 2025-10-06 14:27:07.706 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:27:08 compute-0 nova_compute[192903]: 2025-10-06 14:27:08.214 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:27:08 compute-0 nova_compute[192903]: 2025-10-06 14:27:08.729 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:27:08 compute-0 nova_compute[192903]: 2025-10-06 14:27:08.729 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.220s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:09 compute-0 podman[227031]: 2025-10-06 14:27:09.198777608 +0000 UTC m=+0.055228479 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 14:27:09 compute-0 nova_compute[192903]: 2025-10-06 14:27:09.598 2 DEBUG nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkltwqzq_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1e54903-0242-47cc-9a49-a10112fb0f51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:27:09 compute-0 nova_compute[192903]: 2025-10-06 14:27:09.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:10 compute-0 nova_compute[192903]: 2025-10-06 14:27:10.612 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:27:10 compute-0 nova_compute[192903]: 2025-10-06 14:27:10.614 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:27:10 compute-0 nova_compute[192903]: 2025-10-06 14:27:10.614 2 DEBUG nova.network.neutron [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:27:11 compute-0 nova_compute[192903]: 2025-10-06 14:27:11.121 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:11.400 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:11.401 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:11.402 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:11 compute-0 nova_compute[192903]: 2025-10-06 14:27:11.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:11 compute-0 nova_compute[192903]: 2025-10-06 14:27:11.725 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:11 compute-0 nova_compute[192903]: 2025-10-06 14:27:11.726 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:12 compute-0 nova_compute[192903]: 2025-10-06 14:27:12.238 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:12 compute-0 nova_compute[192903]: 2025-10-06 14:27:12.239 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:12 compute-0 nova_compute[192903]: 2025-10-06 14:27:12.239 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:27:12 compute-0 nova_compute[192903]: 2025-10-06 14:27:12.643 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:13 compute-0 nova_compute[192903]: 2025-10-06 14:27:13.533 2 DEBUG nova.network.neutron [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Updating instance_info_cache with network_info: [{"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.059 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.079 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkltwqzq_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1e54903-0242-47cc-9a49-a10112fb0f51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.080 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Creating instance directory: /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.081 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Creating disk.info with the contents: {'/var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk': 'qcow2', '/var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.081 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.082 2 DEBUG nova.objects.instance [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid f1e54903-0242-47cc-9a49-a10112fb0f51 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.588 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.594 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.596 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.682 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.684 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.685 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.686 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.692 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.693 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.783 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.784 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.826 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.827 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.828 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.892 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.893 2 DEBUG nova.virt.disk.api [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.894 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.963 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.964 2 DEBUG nova.virt.disk.api [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:27:14 compute-0 nova_compute[192903]: 2025-10-06 14:27:14.965 2 DEBUG nova.objects.instance [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid f1e54903-0242-47cc-9a49-a10112fb0f51 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.475 2 DEBUG nova.objects.base [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<f1e54903-0242-47cc-9a49-a10112fb0f51> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.476 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.503 2 DEBUG oslo_concurrency.processutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51/disk.config 497664" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.504 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.505 2 DEBUG nova.virt.libvirt.vif [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:25:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-422652160',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-422652160',id=26,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:25:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-7abwsol2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:25:43Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f1e54903-0242-47cc-9a49-a10112fb0f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.505 2 DEBUG nova.network.os_vif_util [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.506 2 DEBUG nova.network.os_vif_util [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.507 2 DEBUG os_vif [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.509 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9eb0523e-e227-51e3-a7f1-d4308da8eaa8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbda5a10c-83, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbda5a10c-83, col_values=(('qos', UUID('5ba73ee8-b7b0-438a-a5a8-49c0f102bd86')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbda5a10c-83, col_values=(('external_ids', {'iface-id': 'bda5a10c-8301-40a8-94d3-776e40349dfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:a6:ec', 'vm-uuid': 'f1e54903-0242-47cc-9a49-a10112fb0f51'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:27:15 compute-0 NetworkManager[52035]: <info>  [1759760835.5181] manager: (tapbda5a10c-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.524 2 INFO os_vif [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83')
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.524 2 DEBUG nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.525 2 DEBUG nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkltwqzq_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1e54903-0242-47cc-9a49-a10112fb0f51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.525 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:15 compute-0 nova_compute[192903]: 2025-10-06 14:27:15.616 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:16 compute-0 nova_compute[192903]: 2025-10-06 14:27:16.692 2 DEBUG nova.network.neutron [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Port bda5a10c-8301-40a8-94d3-776e40349dfa updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:27:16 compute-0 nova_compute[192903]: 2025-10-06 14:27:16.713 2 DEBUG nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkltwqzq_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f1e54903-0242-47cc-9a49-a10112fb0f51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:27:17 compute-0 nova_compute[192903]: 2025-10-06 14:27:17.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:17 compute-0 nova_compute[192903]: 2025-10-06 14:27:17.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 kernel: tapbda5a10c-83: entered promiscuous mode
Oct 06 14:27:19 compute-0 NetworkManager[52035]: <info>  [1759760839.7646] manager: (tapbda5a10c-83): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Oct 06 14:27:19 compute-0 ovn_controller[95205]: 2025-10-06T14:27:19Z|00244|binding|INFO|Claiming lport bda5a10c-8301-40a8-94d3-776e40349dfa for this additional chassis.
Oct 06 14:27:19 compute-0 ovn_controller[95205]: 2025-10-06T14:27:19Z|00245|binding|INFO|bda5a10c-8301-40a8-94d3-776e40349dfa: Claiming fa:16:3e:60:a6:ec 10.100.0.10
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.775 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a6:ec 10.100.0.10'], port_security=['fa:16:3e:60:a6:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e54903-0242-47cc-9a49-a10112fb0f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=bda5a10c-8301-40a8-94d3-776e40349dfa) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.777 104072 INFO neutron.agent.ovn.metadata.agent [-] Port bda5a10c-8301-40a8-94d3-776e40349dfa in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.778 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:27:19 compute-0 ovn_controller[95205]: 2025-10-06T14:27:19Z|00246|binding|INFO|Setting lport bda5a10c-8301-40a8-94d3-776e40349dfa ovn-installed in OVS
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.802 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[df2d133b-c483-4b69-a90c-c19c2466bd9f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 systemd-machined[152985]: New machine qemu-22-instance-0000001a.
Oct 06 14:27:19 compute-0 systemd-udevd[227089]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:27:19 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001a.
Oct 06 14:27:19 compute-0 NetworkManager[52035]: <info>  [1759760839.8378] device (tapbda5a10c-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:27:19 compute-0 NetworkManager[52035]: <info>  [1759760839.8409] device (tapbda5a10c-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.845 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[486e667a-6a97-4aad-ba03-2599dfbf639e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.848 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[31295011-e98c-4d52-9464-551fd42e961e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.889 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[18a98003-c55b-4918-bc9b-e9b49255b070]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.912 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[983ad67f-b04c-4f43-9474-f5d7102fffc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527326, 'reachable_time': 42908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227100, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.929 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[efff8aa9-bd2a-433d-8ced-b1113c0675c1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527341, 'tstamp': 527341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227102, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527346, 'tstamp': 527346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227102, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.930 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 nova_compute[192903]: 2025-10-06 14:27:19.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.935 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.935 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.936 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.936 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:27:19 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:19.939 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c13869f0-98c8-4eae-9d9a-e6ee67949d0f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:20 compute-0 nova_compute[192903]: 2025-10-06 14:27:20.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:22 compute-0 ovn_controller[95205]: 2025-10-06T14:27:22Z|00247|binding|INFO|Claiming lport bda5a10c-8301-40a8-94d3-776e40349dfa for this chassis.
Oct 06 14:27:22 compute-0 ovn_controller[95205]: 2025-10-06T14:27:22Z|00248|binding|INFO|bda5a10c-8301-40a8-94d3-776e40349dfa: Claiming fa:16:3e:60:a6:ec 10.100.0.10
Oct 06 14:27:22 compute-0 ovn_controller[95205]: 2025-10-06T14:27:22Z|00249|binding|INFO|Setting lport bda5a10c-8301-40a8-94d3-776e40349dfa up in Southbound
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.068 2 INFO nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Post operation of migration started
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.069 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.334 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.335 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.439 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.439 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.440 2 DEBUG nova.network.neutron [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:24 compute-0 nova_compute[192903]: 2025-10-06 14:27:24.947 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:25 compute-0 podman[227128]: 2025-10-06 14:27:25.210918751 +0000 UTC m=+0.055765463 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:27:25 compute-0 podman[227126]: 2025-10-06 14:27:25.217227004 +0000 UTC m=+0.068907403 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:27:25 compute-0 podman[227127]: 2025-10-06 14:27:25.229334414 +0000 UTC m=+0.077241700 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Oct 06 14:27:25 compute-0 podman[227125]: 2025-10-06 14:27:25.254720858 +0000 UTC m=+0.112988747 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 06 14:27:25 compute-0 nova_compute[192903]: 2025-10-06 14:27:25.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:25 compute-0 nova_compute[192903]: 2025-10-06 14:27:25.865 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:26 compute-0 nova_compute[192903]: 2025-10-06 14:27:26.012 2 DEBUG nova.network.neutron [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Updating instance_info_cache with network_info: [{"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:26 compute-0 nova_compute[192903]: 2025-10-06 14:27:26.521 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-f1e54903-0242-47cc-9a49-a10112fb0f51" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:27:27 compute-0 nova_compute[192903]: 2025-10-06 14:27:27.059 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:27 compute-0 nova_compute[192903]: 2025-10-06 14:27:27.061 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:27 compute-0 nova_compute[192903]: 2025-10-06 14:27:27.061 2 DEBUG oslo_concurrency.lockutils [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:27 compute-0 nova_compute[192903]: 2025-10-06 14:27:27.068 2 INFO nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:27:27 compute-0 virtqemud[192802]: Domain id=22 name='instance-0000001a' uuid=f1e54903-0242-47cc-9a49-a10112fb0f51 is tainted: custom-monitor
Oct 06 14:27:28 compute-0 nova_compute[192903]: 2025-10-06 14:27:28.081 2 INFO nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:27:29 compute-0 nova_compute[192903]: 2025-10-06 14:27:29.089 2 INFO nova.virt.libvirt.driver [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:27:29 compute-0 nova_compute[192903]: 2025-10-06 14:27:29.096 2 DEBUG nova.compute.manager [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:27:29 compute-0 nova_compute[192903]: 2025-10-06 14:27:29.608 2 DEBUG nova.objects.instance [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:27:29 compute-0 nova_compute[192903]: 2025-10-06 14:27:29.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:29 compute-0 podman[203308]: time="2025-10-06T14:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:27:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:27:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Oct 06 14:27:30 compute-0 nova_compute[192903]: 2025-10-06 14:27:30.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:30 compute-0 nova_compute[192903]: 2025-10-06 14:27:30.626 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: ERROR   14:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: ERROR   14:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: ERROR   14:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: ERROR   14:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: ERROR   14:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:27:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:27:31 compute-0 nova_compute[192903]: 2025-10-06 14:27:31.515 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:31 compute-0 nova_compute[192903]: 2025-10-06 14:27:31.516 2 WARNING neutronclient.v2_0.client [None req-f9d070c4-61e3-4604-a6d2-9daf8d28623d f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:34 compute-0 nova_compute[192903]: 2025-10-06 14:27:34.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.964 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.964 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.965 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.965 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.966 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:35 compute-0 nova_compute[192903]: 2025-10-06 14:27:35.982 2 INFO nova.compute.manager [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Terminating instance
Oct 06 14:27:36 compute-0 podman[227214]: 2025-10-06 14:27:36.221348073 +0000 UTC m=+0.078422972 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.503 2 DEBUG nova.compute.manager [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:27:36 compute-0 kernel: tap3d929184-72 (unregistering): left promiscuous mode
Oct 06 14:27:36 compute-0 NetworkManager[52035]: <info>  [1759760856.5314] device (tap3d929184-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:36 compute-0 ovn_controller[95205]: 2025-10-06T14:27:36Z|00250|binding|INFO|Releasing lport 3d929184-7221-43be-9263-037768dac50b from this chassis (sb_readonly=0)
Oct 06 14:27:36 compute-0 ovn_controller[95205]: 2025-10-06T14:27:36Z|00251|binding|INFO|Setting lport 3d929184-7221-43be-9263-037768dac50b down in Southbound
Oct 06 14:27:36 compute-0 ovn_controller[95205]: 2025-10-06T14:27:36Z|00252|binding|INFO|Removing iface tap3d929184-72 ovn-installed in OVS
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.556 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:5e:28 10.100.0.11'], port_security=['fa:16:3e:a7:5e:28 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b5e5a9ae-529f-46a2-9288-091ff73bf07f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=3d929184-7221-43be-9263-037768dac50b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.558 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 3d929184-7221-43be-9263-037768dac50b in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.560 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55ccf1b2-d24e-4063-b15b-60a65227d75e
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.587 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[21d00075-7042-498d-ad71-c5f51d0adcbe]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Oct 06 14:27:36 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Consumed 3.763s CPU time.
Oct 06 14:27:36 compute-0 systemd-machined[152985]: Machine qemu-21-instance-0000001b terminated.
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.630 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[cc684437-7b71-4cf7-bac5-b191ae531cdd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.633 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc4451a-6f86-4d11-9a11-a916a2d6ec69]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.670 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e8301d-b3d5-4c47-93ba-16fabb459057]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.691 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e05199a0-e631-40c9-ba75-7d7a66d94732]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55ccf1b2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:aa:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527326, 'reachable_time': 42908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227250, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.702 2 DEBUG nova.compute.manager [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Received event network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.703 2 DEBUG oslo_concurrency.lockutils [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.703 2 DEBUG oslo_concurrency.lockutils [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.703 2 DEBUG oslo_concurrency.lockutils [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.704 2 DEBUG nova.compute.manager [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] No waiting events found dispatching network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.704 2 DEBUG nova.compute.manager [req-1d9bf317-311e-4292-890f-e7546e3106ed req-0e1f06a9-b3e6-47a6-aac6-2f584d2756a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Received event network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.719 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[80a6589d-f9ef-4a89-acf7-c282a3a692b1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527341, 'tstamp': 527341}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227251, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap55ccf1b2-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527346, 'tstamp': 527346}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227251, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.720 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.729 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55ccf1b2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.730 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.730 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55ccf1b2-d0, col_values=(('external_ids', {'iface-id': '0ee47753-a40c-4a21-a6ed-65093b6727d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.731 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:27:36 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:36.733 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef2c1fe-a763-48e5-87ce-5a70c33b6ff8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-55ccf1b2-d24e-4063-b15b-60a65227d75e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 55ccf1b2-d24e-4063-b15b-60a65227d75e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.778 2 INFO nova.virt.libvirt.driver [-] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Instance destroyed successfully.
Oct 06 14:27:36 compute-0 nova_compute[192903]: 2025-10-06 14:27:36.780 2 DEBUG nova.objects.instance [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid b5e5a9ae-529f-46a2-9288-091ff73bf07f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.290 2 DEBUG nova.virt.libvirt.vif [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:25:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-310983494',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-310983494',id=27,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:26:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-hk300dlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:27:00Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=b5e5a9ae-529f-46a2-9288-091ff73bf07f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.292 2 DEBUG nova.network.os_vif_util [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "3d929184-7221-43be-9263-037768dac50b", "address": "fa:16:3e:a7:5e:28", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d929184-72", "ovs_interfaceid": "3d929184-7221-43be-9263-037768dac50b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.293 2 DEBUG nova.network.os_vif_util [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.294 2 DEBUG os_vif [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d929184-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=21c92a1d-7fd9-4af2-adff-815f428c135f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.309 2 INFO os_vif [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:5e:28,bridge_name='br-int',has_traffic_filtering=True,id=3d929184-7221-43be-9263-037768dac50b,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d929184-72')
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.309 2 INFO nova.virt.libvirt.driver [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Deleting instance files /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f_del
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.311 2 INFO nova.virt.libvirt.driver [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Deletion of /var/lib/nova/instances/b5e5a9ae-529f-46a2-9288-091ff73bf07f_del complete
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.830 2 INFO nova.compute.manager [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.831 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.832 2 DEBUG nova.compute.manager [-] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.832 2 DEBUG nova.network.neutron [-] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.833 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:37 compute-0 nova_compute[192903]: 2025-10-06 14:27:37.984 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.297 2 DEBUG nova.compute.manager [req-309b0d24-73c6-4121-a865-6cd717c5ccd2 req-7fe407a5-d4e9-476d-9c20-d8006d61707c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Received event network-vif-deleted-3d929184-7221-43be-9263-037768dac50b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.297 2 INFO nova.compute.manager [req-309b0d24-73c6-4121-a865-6cd717c5ccd2 req-7fe407a5-d4e9-476d-9c20-d8006d61707c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Neutron deleted interface 3d929184-7221-43be-9263-037768dac50b; detaching it from the instance and deleting it from the info cache
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.298 2 DEBUG nova.network.neutron [req-309b0d24-73c6-4121-a865-6cd717c5ccd2 req-7fe407a5-d4e9-476d-9c20-d8006d61707c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.742 2 DEBUG nova.network.neutron [-] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.778 2 DEBUG nova.compute.manager [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Received event network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.779 2 DEBUG oslo_concurrency.lockutils [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.780 2 DEBUG oslo_concurrency.lockutils [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.780 2 DEBUG oslo_concurrency.lockutils [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.781 2 DEBUG nova.compute.manager [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] No waiting events found dispatching network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.781 2 DEBUG nova.compute.manager [req-6da06a17-10d0-40f7-953f-1739bbcc0787 req-1f43fc08-0c63-4e8e-be0a-fe5040773459 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Received event network-vif-unplugged-3d929184-7221-43be-9263-037768dac50b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:27:38 compute-0 nova_compute[192903]: 2025-10-06 14:27:38.807 2 DEBUG nova.compute.manager [req-309b0d24-73c6-4121-a865-6cd717c5ccd2 req-7fe407a5-d4e9-476d-9c20-d8006d61707c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Detach interface failed, port_id=3d929184-7221-43be-9263-037768dac50b, reason: Instance b5e5a9ae-529f-46a2-9288-091ff73bf07f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.251 2 INFO nova.compute.manager [-] [instance: b5e5a9ae-529f-46a2-9288-091ff73bf07f] Took 1.42 seconds to deallocate network for instance.
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.775 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.775 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.813 2 DEBUG nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.839 2 DEBUG nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.840 2 DEBUG nova.compute.provider_tree [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.855 2 DEBUG nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.874 2 DEBUG nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:27:39 compute-0 nova_compute[192903]: 2025-10-06 14:27:39.929 2 DEBUG nova.compute.provider_tree [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:27:40 compute-0 podman[227269]: 2025-10-06 14:27:40.210741704 +0000 UTC m=+0.071741460 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64)
Oct 06 14:27:40 compute-0 nova_compute[192903]: 2025-10-06 14:27:40.437 2 DEBUG nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:27:40 compute-0 nova_compute[192903]: 2025-10-06 14:27:40.952 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.176s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:40 compute-0 nova_compute[192903]: 2025-10-06 14:27:40.972 2 INFO nova.scheduler.client.report [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance b5e5a9ae-529f-46a2-9288-091ff73bf07f
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.006 2 DEBUG oslo_concurrency.lockutils [None req-5c610a2d-ca22-4463-ba68-a24298d3897a 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "b5e5a9ae-529f-46a2-9288-091ff73bf07f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.042s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.560 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "f1e54903-0242-47cc-9a49-a10112fb0f51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.560 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.561 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.561 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.562 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:42 compute-0 nova_compute[192903]: 2025-10-06 14:27:42.581 2 INFO nova.compute.manager [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Terminating instance
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.105 2 DEBUG nova.compute.manager [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:27:43 compute-0 kernel: tapbda5a10c-83 (unregistering): left promiscuous mode
Oct 06 14:27:43 compute-0 NetworkManager[52035]: <info>  [1759760863.1302] device (tapbda5a10c-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00253|binding|INFO|Releasing lport bda5a10c-8301-40a8-94d3-776e40349dfa from this chassis (sb_readonly=0)
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00254|binding|INFO|Setting lport bda5a10c-8301-40a8-94d3-776e40349dfa down in Southbound
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00255|binding|INFO|Removing iface tapbda5a10c-83 ovn-installed in OVS
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.150 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a6:ec 10.100.0.10'], port_security=['fa:16:3e:60:a6:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e54903-0242-47cc-9a49-a10112fb0f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=bda5a10c-8301-40a8-94d3-776e40349dfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.151 104072 INFO neutron.agent.ovn.metadata.agent [-] Port bda5a10c-8301-40a8-94d3-776e40349dfa in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.153 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.154 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[49dc1e1f-0f42-45ad-afa9-d5443db13cde]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.154 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e namespace which is not needed anymore
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct 06 14:27:43 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001a.scope: Consumed 2.859s CPU time.
Oct 06 14:27:43 compute-0 systemd-machined[152985]: Machine qemu-22-instance-0000001a terminated.
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.300 2 DEBUG nova.compute.manager [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Received event network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.301 2 DEBUG oslo_concurrency.lockutils [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.301 2 DEBUG oslo_concurrency.lockutils [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.301 2 DEBUG oslo_concurrency.lockutils [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.301 2 DEBUG nova.compute.manager [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] No waiting events found dispatching network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.302 2 DEBUG nova.compute.manager [req-958817ec-76c4-415a-b327-136dfe02b67e req-785e5f0e-8f16-4c7c-92fa-4e4e62e0a16c e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Received event network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:27:43 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [NOTICE]   (226888) : haproxy version is 3.0.5-8e879a5
Oct 06 14:27:43 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [NOTICE]   (226888) : path to executable is /usr/sbin/haproxy
Oct 06 14:27:43 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [WARNING]  (226888) : Exiting Master process...
Oct 06 14:27:43 compute-0 podman[227315]: 2025-10-06 14:27:43.321469944 +0000 UTC m=+0.044076084 container kill 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 06 14:27:43 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [ALERT]    (226888) : Current worker (226890) exited with code 143 (Terminated)
Oct 06 14:27:43 compute-0 neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e[226884]: [WARNING]  (226888) : All workers exited. Exiting... (0)
Oct 06 14:27:43 compute-0 systemd[1]: libpod-48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0.scope: Deactivated successfully.
Oct 06 14:27:43 compute-0 NetworkManager[52035]: <info>  [1759760863.3309] manager: (tapbda5a10c-83): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct 06 14:27:43 compute-0 kernel: tapbda5a10c-83: entered promiscuous mode
Oct 06 14:27:43 compute-0 kernel: tapbda5a10c-83 (unregistering): left promiscuous mode
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00256|binding|INFO|Claiming lport bda5a10c-8301-40a8-94d3-776e40349dfa for this chassis.
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00257|binding|INFO|bda5a10c-8301-40a8-94d3-776e40349dfa: Claiming fa:16:3e:60:a6:ec 10.100.0.10
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.376 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a6:ec 10.100.0.10'], port_security=['fa:16:3e:60:a6:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e54903-0242-47cc-9a49-a10112fb0f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=bda5a10c-8301-40a8-94d3-776e40349dfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:43 compute-0 ovn_controller[95205]: 2025-10-06T14:27:43Z|00258|binding|INFO|Releasing lport bda5a10c-8301-40a8-94d3-776e40349dfa from this chassis (sb_readonly=0)
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 podman[227334]: 2025-10-06 14:27:43.392536214 +0000 UTC m=+0.037289419 container died 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.396 2 INFO nova.virt.libvirt.driver [-] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Instance destroyed successfully.
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.396 2 DEBUG nova.objects.instance [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lazy-loading 'resources' on Instance uuid f1e54903-0242-47cc-9a49-a10112fb0f51 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.399 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:a6:ec 10.100.0.10'], port_security=['fa:16:3e:60:a6:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e54903-0242-47cc-9a49-a10112fb0f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3f3b7d20fc4715811486da569fc0ab', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'ee0f1b78-b8b4-4b5b-99dc-62aebf1f3628', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c0cf4ae-6c3e-4762-8bd8-0b142a730d60, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=bda5a10c-8301-40a8-94d3-776e40349dfa) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0-userdata-shm.mount: Deactivated successfully.
Oct 06 14:27:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-af307c3a8131306c14b1cd564ad8c750d39b9ac837c87be17f4702cd0e877049-merged.mount: Deactivated successfully.
Oct 06 14:27:43 compute-0 podman[227334]: 2025-10-06 14:27:43.43487751 +0000 UTC m=+0.079630685 container cleanup 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:27:43 compute-0 systemd[1]: libpod-conmon-48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0.scope: Deactivated successfully.
Oct 06 14:27:43 compute-0 podman[227338]: 2025-10-06 14:27:43.447557706 +0000 UTC m=+0.081667201 container remove 48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.452 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc328ef-8def-4e1b-b4ab-72ec17a213b7]: (4, ("Mon Oct  6 02:27:43 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0)\n48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0\nMon Oct  6 02:27:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e (48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0)\n48b1dc9c4e5bf9a46c41b686a1984359bf7f1335e4a2a0b5f0d5346197619ea0\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.453 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ae9ff6-fa89-4798-be78-096a0ef04469]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.454 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55ccf1b2-d24e-4063-b15b-60a65227d75e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.454 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[09b3dfb7-4222-4bb2-8b8a-ffab43489c77]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.455 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55ccf1b2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 kernel: tap55ccf1b2-d0: left promiscuous mode
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.474 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb37197-626e-4927-b188-0bc83d8d5dc9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.499 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9db60c-b364-43cc-aeaa-9a4e87386912]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.500 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c655f8d4-a812-4780-a3cc-554ab38fa19e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.522 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0173c33d-c18b-4a8e-b5fb-544e7d7a4425]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527315, 'reachable_time': 38797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227380, 'error': None, 'target': 'ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.525 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55ccf1b2-d24e-4063-b15b-60a65227d75e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.525 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[fb59f55a-536d-436e-b7f5-57d18c71d742]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.525 104072 INFO neutron.agent.ovn.metadata.agent [-] Port bda5a10c-8301-40a8-94d3-776e40349dfa in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.526 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.527 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b4386657-80de-4933-8b8e-189b6fc46886]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d55ccf1b2\x2dd24e\x2d4063\x2db15b\x2d60a65227d75e.mount: Deactivated successfully.
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.527 104072 INFO neutron.agent.ovn.metadata.agent [-] Port bda5a10c-8301-40a8-94d3-776e40349dfa in datapath 55ccf1b2-d24e-4063-b15b-60a65227d75e unbound from our chassis
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.528 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ccf1b2-d24e-4063-b15b-60a65227d75e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:27:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:43.529 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[858f286f-091d-4ce6-9770-17e79e23c2a8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.902 2 DEBUG nova.virt.libvirt.vif [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:25:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-422652160',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-422652160',id=26,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:25:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3f3b7d20fc4715811486da569fc0ab',ramdisk_id='',reservation_id='r-7abwsol2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,manager,reader',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1255317741',owner_user_name='tempest-TestExecuteStrategies-1255317741-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:27:30Z,user_data=None,user_id='98ee6da236ba42baa0fef11dcb52cbdd',uuid=f1e54903-0242-47cc-9a49-a10112fb0f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.903 2 DEBUG nova.network.os_vif_util [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converting VIF {"id": "bda5a10c-8301-40a8-94d3-776e40349dfa", "address": "fa:16:3e:60:a6:ec", "network": {"id": "55ccf1b2-d24e-4063-b15b-60a65227d75e", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1589468303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa26a29b35704c20a2516da6a6faa917", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbda5a10c-83", "ovs_interfaceid": "bda5a10c-8301-40a8-94d3-776e40349dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.904 2 DEBUG nova.network.os_vif_util [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.904 2 DEBUG os_vif [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.907 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbda5a10c-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.914 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=5ba73ee8-b7b0-438a-a5a8-49c0f102bd86) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.919 2 INFO os_vif [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:a6:ec,bridge_name='br-int',has_traffic_filtering=True,id=bda5a10c-8301-40a8-94d3-776e40349dfa,network=Network(55ccf1b2-d24e-4063-b15b-60a65227d75e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbda5a10c-83')
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.920 2 INFO nova.virt.libvirt.driver [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Deleting instance files /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51_del
Oct 06 14:27:43 compute-0 nova_compute[192903]: 2025-10-06 14:27:43.921 2 INFO nova.virt.libvirt.driver [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Deletion of /var/lib/nova/instances/f1e54903-0242-47cc-9a49-a10112fb0f51_del complete
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.435 2 INFO nova.compute.manager [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.436 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.437 2 DEBUG nova.compute.manager [-] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.437 2 DEBUG nova.network.neutron [-] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.438 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:44 compute-0 nova_compute[192903]: 2025-10-06 14:27:44.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.357 2 DEBUG nova.compute.manager [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Received event network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.358 2 DEBUG oslo_concurrency.lockutils [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.359 2 DEBUG oslo_concurrency.lockutils [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.359 2 DEBUG oslo_concurrency.lockutils [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.360 2 DEBUG nova.compute.manager [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] No waiting events found dispatching network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.360 2 DEBUG nova.compute.manager [req-292ef933-2fe7-4643-a427-54aa7d0d2356 req-ce691000-c4f9-427e-8711-407f80b38395 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Received event network-vif-unplugged-bda5a10c-8301-40a8-94d3-776e40349dfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:27:45 compute-0 nova_compute[192903]: 2025-10-06 14:27:45.649 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:27:46 compute-0 nova_compute[192903]: 2025-10-06 14:27:46.616 2 DEBUG nova.compute.manager [req-e103e369-9e08-4750-930b-d46ecf16a066 req-fdbf47f1-8fe6-4df1-8bfa-9b0282310384 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Received event network-vif-deleted-bda5a10c-8301-40a8-94d3-776e40349dfa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:27:46 compute-0 nova_compute[192903]: 2025-10-06 14:27:46.616 2 INFO nova.compute.manager [req-e103e369-9e08-4750-930b-d46ecf16a066 req-fdbf47f1-8fe6-4df1-8bfa-9b0282310384 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Neutron deleted interface bda5a10c-8301-40a8-94d3-776e40349dfa; detaching it from the instance and deleting it from the info cache
Oct 06 14:27:46 compute-0 nova_compute[192903]: 2025-10-06 14:27:46.616 2 DEBUG nova.network.neutron [req-e103e369-9e08-4750-930b-d46ecf16a066 req-fdbf47f1-8fe6-4df1-8bfa-9b0282310384 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:47 compute-0 nova_compute[192903]: 2025-10-06 14:27:47.038 2 DEBUG nova.network.neutron [-] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:27:47 compute-0 nova_compute[192903]: 2025-10-06 14:27:47.124 2 DEBUG nova.compute.manager [req-e103e369-9e08-4750-930b-d46ecf16a066 req-fdbf47f1-8fe6-4df1-8bfa-9b0282310384 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Detach interface failed, port_id=bda5a10c-8301-40a8-94d3-776e40349dfa, reason: Instance f1e54903-0242-47cc-9a49-a10112fb0f51 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:27:47 compute-0 nova_compute[192903]: 2025-10-06 14:27:47.546 2 INFO nova.compute.manager [-] [instance: f1e54903-0242-47cc-9a49-a10112fb0f51] Took 3.11 seconds to deallocate network for instance.
Oct 06 14:27:48 compute-0 nova_compute[192903]: 2025-10-06 14:27:48.072 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:27:48 compute-0 nova_compute[192903]: 2025-10-06 14:27:48.073 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:27:48 compute-0 nova_compute[192903]: 2025-10-06 14:27:48.080 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:48 compute-0 nova_compute[192903]: 2025-10-06 14:27:48.116 2 INFO nova.scheduler.client.report [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Deleted allocations for instance f1e54903-0242-47cc-9a49-a10112fb0f51
Oct 06 14:27:48 compute-0 nova_compute[192903]: 2025-10-06 14:27:48.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:49 compute-0 nova_compute[192903]: 2025-10-06 14:27:49.152 2 DEBUG oslo_concurrency.lockutils [None req-b150611b-a26b-4efe-b18d-df8f544fedab 98ee6da236ba42baa0fef11dcb52cbdd 8f3f3b7d20fc4715811486da569fc0ab - - default default] Lock "f1e54903-0242-47cc-9a49-a10112fb0f51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.592s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:27:49 compute-0 nova_compute[192903]: 2025-10-06 14:27:49.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:53 compute-0 nova_compute[192903]: 2025-10-06 14:27:53.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:54 compute-0 nova_compute[192903]: 2025-10-06 14:27:54.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:54 compute-0 nova_compute[192903]: 2025-10-06 14:27:54.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:54.817 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:27:54 compute-0 nova_compute[192903]: 2025-10-06 14:27:54.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:54.818 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:27:55 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:27:55.819 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:27:56 compute-0 podman[227383]: 2025-10-06 14:27:56.21892092 +0000 UTC m=+0.073324654 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 06 14:27:56 compute-0 podman[227384]: 2025-10-06 14:27:56.234899996 +0000 UTC m=+0.076146770 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:27:56 compute-0 podman[227385]: 2025-10-06 14:27:56.234760442 +0000 UTC m=+0.068854031 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:27:56 compute-0 podman[227382]: 2025-10-06 14:27:56.308494265 +0000 UTC m=+0.160761370 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:27:58 compute-0 nova_compute[192903]: 2025-10-06 14:27:58.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:59 compute-0 nova_compute[192903]: 2025-10-06 14:27:59.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:27:59 compute-0 podman[203308]: time="2025-10-06T14:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:27:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:27:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 06 14:28:00 compute-0 nova_compute[192903]: 2025-10-06 14:28:00.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: ERROR   14:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: ERROR   14:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: ERROR   14:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: ERROR   14:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: ERROR   14:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:28:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:28:03 compute-0 nova_compute[192903]: 2025-10-06 14:28:03.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:04 compute-0 nova_compute[192903]: 2025-10-06 14:28:04.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:04 compute-0 nova_compute[192903]: 2025-10-06 14:28:04.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.102 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.102 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.348 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.350 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.383 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.384 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5845MB free_disk=73.30000305175781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.385 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:05 compute-0 nova_compute[192903]: 2025-10-06 14:28:05.385 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:06 compute-0 nova_compute[192903]: 2025-10-06 14:28:06.458 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:28:06 compute-0 nova_compute[192903]: 2025-10-06 14:28:06.458 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:28:05 up  1:29,  0 user,  load average: 0.23, 0.21, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:28:06 compute-0 nova_compute[192903]: 2025-10-06 14:28:06.483 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:28:06 compute-0 nova_compute[192903]: 2025-10-06 14:28:06.993 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:28:07 compute-0 podman[227465]: 2025-10-06 14:28:07.197737368 +0000 UTC m=+0.060896263 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:28:07 compute-0 nova_compute[192903]: 2025-10-06 14:28:07.504 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:28:07 compute-0 nova_compute[192903]: 2025-10-06 14:28:07.505 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:08 compute-0 nova_compute[192903]: 2025-10-06 14:28:08.501 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:08 compute-0 nova_compute[192903]: 2025-10-06 14:28:08.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:08 compute-0 nova_compute[192903]: 2025-10-06 14:28:08.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:28:08 compute-0 nova_compute[192903]: 2025-10-06 14:28:08.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:09 compute-0 nova_compute[192903]: 2025-10-06 14:28:09.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:09 compute-0 nova_compute[192903]: 2025-10-06 14:28:09.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:11 compute-0 podman[227486]: 2025-10-06 14:28:11.214395784 +0000 UTC m=+0.069997682 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 14:28:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:11.403 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:11.404 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:11.404 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:12.257 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:1d:79 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84027c3a5bc24322a774ec81d91af7d9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c178903-2d5a-46e7-86ba-a026de5c67dd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe) old=Port_Binding(mac=['fa:16:3e:b1:1d:79'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '84027c3a5bc24322a774ec81d91af7d9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:28:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:12.258 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe in datapath a0424b4b-abad-433f-a1fd-549d3e8c60ac updated
Oct 06 14:28:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:12.260 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0424b4b-abad-433f-a1fd-549d3e8c60ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:28:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:12.262 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[41083070-2d17-4fb4-b1af-b94de9635408]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:28:13 compute-0 nova_compute[192903]: 2025-10-06 14:28:13.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:14 compute-0 nova_compute[192903]: 2025-10-06 14:28:14.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:16 compute-0 nova_compute[192903]: 2025-10-06 14:28:16.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:17 compute-0 nova_compute[192903]: 2025-10-06 14:28:17.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:18 compute-0 nova_compute[192903]: 2025-10-06 14:28:18.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:18 compute-0 nova_compute[192903]: 2025-10-06 14:28:18.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:19 compute-0 nova_compute[192903]: 2025-10-06 14:28:19.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:20.608 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:a0:e7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8590b151-dee8-404f-9d36-1902edaddf9c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8590b151-dee8-404f-9d36-1902edaddf9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2391df23-897c-4d05-9776-e767785a98c3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46d96a9f-f261-4efe-9740-1294391b69a8) old=Port_Binding(mac=['fa:16:3e:04:a0:e7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8590b151-dee8-404f-9d36-1902edaddf9c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8590b151-dee8-404f-9d36-1902edaddf9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:28:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:20.608 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46d96a9f-f261-4efe-9740-1294391b69a8 in datapath 8590b151-dee8-404f-9d36-1902edaddf9c updated
Oct 06 14:28:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:20.609 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8590b151-dee8-404f-9d36-1902edaddf9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:28:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:20.610 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b38e6822-fc32-4bda-91c7-d7cda51ded56]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:28:23 compute-0 nova_compute[192903]: 2025-10-06 14:28:23.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:24 compute-0 nova_compute[192903]: 2025-10-06 14:28:24.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:27 compute-0 podman[227510]: 2025-10-06 14:28:27.213200523 +0000 UTC m=+0.068921962 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 06 14:28:27 compute-0 podman[227511]: 2025-10-06 14:28:27.232803318 +0000 UTC m=+0.073811596 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:28:27 compute-0 podman[227509]: 2025-10-06 14:28:27.243895261 +0000 UTC m=+0.103087915 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:28:27 compute-0 podman[227508]: 2025-10-06 14:28:27.256611128 +0000 UTC m=+0.112503362 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:28:28 compute-0 nova_compute[192903]: 2025-10-06 14:28:28.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:29 compute-0 podman[203308]: time="2025-10-06T14:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:28:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:28:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:28:29 compute-0 nova_compute[192903]: 2025-10-06 14:28:29.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: ERROR   14:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: ERROR   14:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: ERROR   14:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: ERROR   14:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: ERROR   14:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:28:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.582 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.582 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.583 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.583 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.583 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:33 compute-0 nova_compute[192903]: 2025-10-06 14:28:33.584 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.600 2 DEBUG nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.601 2 WARNING nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.601 2 INFO nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Removable base files: /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.602 2 INFO nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.602 2 DEBUG nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.602 2 DEBUG nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.602 2 DEBUG nova.virt.libvirt.imagecache [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Oct 06 14:28:34 compute-0 nova_compute[192903]: 2025-10-06 14:28:34.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:34 compute-0 ovn_controller[95205]: 2025-10-06T14:28:34Z|00259|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:28:38 compute-0 podman[227595]: 2025-10-06 14:28:38.195544537 +0000 UTC m=+0.060019099 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:28:39 compute-0 nova_compute[192903]: 2025-10-06 14:28:39.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:39 compute-0 nova_compute[192903]: 2025-10-06 14:28:39.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:42 compute-0 podman[227616]: 2025-10-06 14:28:42.200425861 +0000 UTC m=+0.064876453 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal)
Oct 06 14:28:44 compute-0 nova_compute[192903]: 2025-10-06 14:28:44.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:44 compute-0 nova_compute[192903]: 2025-10-06 14:28:44.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:49 compute-0 nova_compute[192903]: 2025-10-06 14:28:49.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:49 compute-0 nova_compute[192903]: 2025-10-06 14:28:49.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:53 compute-0 nova_compute[192903]: 2025-10-06 14:28:53.504 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:53 compute-0 nova_compute[192903]: 2025-10-06 14:28:53.505 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.010 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.556 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.556 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.563 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.564 2 INFO nova.compute.claims [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:28:54 compute-0 nova_compute[192903]: 2025-10-06 14:28:54.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:55 compute-0 nova_compute[192903]: 2025-10-06 14:28:55.864 2 DEBUG nova.compute.provider_tree [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:28:56 compute-0 nova_compute[192903]: 2025-10-06 14:28:56.371 2 DEBUG nova.scheduler.client.report [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:28:56 compute-0 nova_compute[192903]: 2025-10-06 14:28:56.880 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.323s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:56 compute-0 nova_compute[192903]: 2025-10-06 14:28:56.881 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:28:57 compute-0 nova_compute[192903]: 2025-10-06 14:28:57.397 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:28:57 compute-0 nova_compute[192903]: 2025-10-06 14:28:57.397 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:28:57 compute-0 nova_compute[192903]: 2025-10-06 14:28:57.398 2 WARNING neutronclient.v2_0.client [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:28:57 compute-0 nova_compute[192903]: 2025-10-06 14:28:57.398 2 WARNING neutronclient.v2_0.client [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:28:57 compute-0 nova_compute[192903]: 2025-10-06 14:28:57.907 2 INFO nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:28:58 compute-0 podman[227640]: 2025-10-06 14:28:58.21091402 +0000 UTC m=+0.057618665 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 06 14:28:58 compute-0 podman[227639]: 2025-10-06 14:28:58.232765216 +0000 UTC m=+0.077069605 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:28:58 compute-0 podman[227646]: 2025-10-06 14:28:58.243653964 +0000 UTC m=+0.081513327 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:28:58 compute-0 podman[227638]: 2025-10-06 14:28:58.268686307 +0000 UTC m=+0.123365059 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:28:58 compute-0 nova_compute[192903]: 2025-10-06 14:28:58.415 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:28:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:58.550 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:28:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:28:58.550 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:28:58 compute-0 nova_compute[192903]: 2025-10-06 14:28:58.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:58 compute-0 nova_compute[192903]: 2025-10-06 14:28:58.674 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Successfully created port: 88a071f7-feca-46b3-92db-78710d8b027b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.437 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.439 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.439 2 INFO nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Creating image(s)
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.440 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.441 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.442 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.443 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.450 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.453 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.531 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.533 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.534 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.534 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.537 2 DEBUG oslo_utils.imageutils.format_inspector [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.538 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.616 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.617 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.669 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.670 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.670 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:59 compute-0 podman[203308]: time="2025-10-06T14:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:28:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:28:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.750 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.751 2 DEBUG nova.virt.disk.api [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Checking if we can resize image /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.751 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.816 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.817 2 DEBUG nova.virt.disk.api [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Cannot resize image /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.817 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.817 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Ensure instance console log exists: /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.818 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.818 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:28:59 compute-0 nova_compute[192903]: 2025-10-06 14:28:59.819 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.682 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Successfully updated port: 88a071f7-feca-46b3-92db-78710d8b027b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.793 2 DEBUG nova.compute.manager [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-changed-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.794 2 DEBUG nova.compute.manager [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Refreshing instance network info cache due to event network-changed-88a071f7-feca-46b3-92db-78710d8b027b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.794 2 DEBUG oslo_concurrency.lockutils [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.794 2 DEBUG oslo_concurrency.lockutils [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:29:00 compute-0 nova_compute[192903]: 2025-10-06 14:29:00.794 2 DEBUG nova.network.neutron [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Refreshing network info cache for port 88a071f7-feca-46b3-92db-78710d8b027b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:29:01 compute-0 nova_compute[192903]: 2025-10-06 14:29:01.189 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:29:01 compute-0 nova_compute[192903]: 2025-10-06 14:29:01.300 2 WARNING neutronclient.v2_0.client [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: ERROR   14:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: ERROR   14:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: ERROR   14:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: ERROR   14:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: ERROR   14:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:29:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:29:01 compute-0 nova_compute[192903]: 2025-10-06 14:29:01.537 2 DEBUG nova.network.neutron [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:29:01 compute-0 nova_compute[192903]: 2025-10-06 14:29:01.603 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:01 compute-0 nova_compute[192903]: 2025-10-06 14:29:01.705 2 DEBUG nova.network.neutron [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:29:02 compute-0 nova_compute[192903]: 2025-10-06 14:29:02.218 2 DEBUG oslo_concurrency.lockutils [req-9dddbdda-f2ca-481a-ae4a-65e8e7c109eb req-9f54e6cf-042d-49a0-ba5e-3e629a617b6f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:29:02 compute-0 nova_compute[192903]: 2025-10-06 14:29:02.219 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquired lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:29:02 compute-0 nova_compute[192903]: 2025-10-06 14:29:02.219 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:29:03 compute-0 nova_compute[192903]: 2025-10-06 14:29:03.534 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:29:03 compute-0 nova_compute[192903]: 2025-10-06 14:29:03.719 2 WARNING neutronclient.v2_0.client [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.399 2 DEBUG nova.network.neutron [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Updating instance_info_cache with network_info: [{"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.906 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Releasing lock "refresh_cache-97c347c0-834f-446c-8585-a132ba411853" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.906 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance network_info: |[{"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.908 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Start _get_guest_xml network_info=[{"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.914 2 WARNING nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.915 2 DEBUG nova.virt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1966725592', uuid='97c347c0-834f-446c-8585-a132ba411853'), owner=OwnerMeta(userid='056f3d4527be4c01acce85b1b5641775', username='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin', projectid='3d54ae6464194adf9e6c766021f7d34d', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": 
"88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759760944.9155774) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.922 2 DEBUG nova.virt.libvirt.host [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.923 2 DEBUG nova.virt.libvirt.host [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.925 2 DEBUG nova.virt.libvirt.host [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.926 2 DEBUG nova.virt.libvirt.host [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.927 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.927 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.927 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.928 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.929 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.929 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.929 2 DEBUG nova.virt.hardware [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.933 2 DEBUG nova.virt.libvirt.vif [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:28:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1966725592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1966725592',id=29,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d54ae6464194adf9e6c766021f7d34d',ramdisk_id='',reservation_id='r-ugfbr727',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469',owner_user_name='tempest-
TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:28:58Z,user_data=None,user_id='056f3d4527be4c01acce85b1b5641775',uuid=97c347c0-834f-446c-8585-a132ba411853,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.934 2 DEBUG nova.network.os_vif_util [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converting VIF {"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.934 2 DEBUG nova.network.os_vif_util [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:29:04 compute-0 nova_compute[192903]: 2025-10-06 14:29:04.935 2 DEBUG nova.objects.instance [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lazy-loading 'pci_devices' on Instance uuid 97c347c0-834f-446c-8585-a132ba411853 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.443 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <uuid>97c347c0-834f-446c-8585-a132ba411853</uuid>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <name>instance-0000001d</name>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1966725592</nova:name>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:29:04</nova:creationTime>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:29:05 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:29:05 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:user uuid="056f3d4527be4c01acce85b1b5641775">tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin</nova:user>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:project uuid="3d54ae6464194adf9e6c766021f7d34d">tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469</nova:project>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         <nova:port uuid="88a071f7-feca-46b3-92db-78710d8b027b">
Oct 06 14:29:05 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <system>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="serial">97c347c0-834f-446c-8585-a132ba411853</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="uuid">97c347c0-834f-446c-8585-a132ba411853</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </system>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <os>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </os>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <features>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </features>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.config"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:b9:ee:84"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <target dev="tap88a071f7-fe"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/console.log" append="off"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <video>
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </video>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:29:05 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:29:05 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:29:05 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:29:05 compute-0 nova_compute[192903]: </domain>
Oct 06 14:29:05 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.445 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Preparing to wait for external event network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.446 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.446 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.447 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.448 2 DEBUG nova.virt.libvirt.vif [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:28:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1966725592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1966725592',id=29,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d54ae6464194adf9e6c766021f7d34d',ramdisk_id='',reservation_id='r-ugfbr727',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469',owner_user_name
='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:28:58Z,user_data=None,user_id='056f3d4527be4c01acce85b1b5641775',uuid=97c347c0-834f-446c-8585-a132ba411853,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.448 2 DEBUG nova.network.os_vif_util [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converting VIF {"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.449 2 DEBUG nova.network.os_vif_util [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.450 2 DEBUG os_vif [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '161bc55f-756b-5b26-8a56-a5a88930c568', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88a071f7-fe, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap88a071f7-fe, col_values=(('qos', UUID('750ffbd0-ee5c-4bad-81a0-fb84c9d2bc4f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap88a071f7-fe, col_values=(('external_ids', {'iface-id': '88a071f7-feca-46b3-92db-78710d8b027b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:ee:84', 'vm-uuid': '97c347c0-834f-446c-8585-a132ba411853'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 NetworkManager[52035]: <info>  [1759760945.4669] manager: (tap88a071f7-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.472 2 INFO os_vif [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe')
Oct 06 14:29:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:05.552 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:05 compute-0 nova_compute[192903]: 2025-10-06 14:29:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:06 compute-0 nova_compute[192903]: 2025-10-06 14:29:06.117 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:06 compute-0 nova_compute[192903]: 2025-10-06 14:29:06.118 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:06 compute-0 nova_compute[192903]: 2025-10-06 14:29:06.118 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:06 compute-0 nova_compute[192903]: 2025-10-06 14:29:06.118 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.020 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.020 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.020 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] No VIF found with MAC fa:16:3e:b9:ee:84, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.021 2 INFO nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Using config drive
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.149 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.208 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.209 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.267 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.269 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000001d, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.config'
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.421 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.423 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.442 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.444 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.29977798461914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.444 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.445 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.531 2 WARNING neutronclient.v2_0.client [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.696 2 INFO nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Creating config drive at /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.config
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.702 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpt29hfqzd execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.832 2 DEBUG oslo_concurrency.processutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpt29hfqzd" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:07 compute-0 kernel: tap88a071f7-fe: entered promiscuous mode
Oct 06 14:29:07 compute-0 NetworkManager[52035]: <info>  [1759760947.9131] manager: (tap88a071f7-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Oct 06 14:29:07 compute-0 ovn_controller[95205]: 2025-10-06T14:29:07Z|00260|binding|INFO|Claiming lport 88a071f7-feca-46b3-92db-78710d8b027b for this chassis.
Oct 06 14:29:07 compute-0 ovn_controller[95205]: 2025-10-06T14:29:07Z|00261|binding|INFO|88a071f7-feca-46b3-92db-78710d8b027b: Claiming fa:16:3e:b9:ee:84 10.100.0.3
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.942 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:ee:84 10.100.0.3'], port_security=['fa:16:3e:b9:ee:84 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '97c347c0-834f-446c-8585-a132ba411853', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '14ab3d49-c7f0-4724-a78b-49797efce79d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c178903-2d5a-46e7-86ba-a026de5c67dd, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=88a071f7-feca-46b3-92db-78710d8b027b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.944 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 88a071f7-feca-46b3-92db-78710d8b027b in datapath a0424b4b-abad-433f-a1fd-549d3e8c60ac bound to our chassis
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.945 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0424b4b-abad-433f-a1fd-549d3e8c60ac
Oct 06 14:29:07 compute-0 systemd-udevd[227764]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:29:07 compute-0 NetworkManager[52035]: <info>  [1759760947.9604] device (tap88a071f7-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:29:07 compute-0 NetworkManager[52035]: <info>  [1759760947.9616] device (tap88a071f7-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:29:07 compute-0 systemd-machined[152985]: New machine qemu-23-instance-0000001d.
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.965 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9c828c99-429c-4eb9-a0a1-878dc01882bb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.966 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa0424b4b-a1 in ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.968 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa0424b4b-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.968 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[16f28cc8-05c5-4c94-b272-47a135580fc0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.969 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[36f55119-e8c5-4539-83b1-f874afba54b6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:07 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001d.
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:07.985 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[9fee5715-af33-454c-9f9e-9e29aa1e100c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:07 compute-0 ovn_controller[95205]: 2025-10-06T14:29:07Z|00262|binding|INFO|Setting lport 88a071f7-feca-46b3-92db-78710d8b027b ovn-installed in OVS
Oct 06 14:29:07 compute-0 ovn_controller[95205]: 2025-10-06T14:29:07Z|00263|binding|INFO|Setting lport 88a071f7-feca-46b3-92db-78710d8b027b up in Southbound
Oct 06 14:29:07 compute-0 nova_compute[192903]: 2025-10-06 14:29:07.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.003 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c9567da9-6029-46e7-a3ef-ef13999ee88e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.037 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[4578f933-3929-43d8-bd91-dc47dcab7348]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.043 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e88fd8b2-8cfd-48f2-be53-6a066bc36b72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 NetworkManager[52035]: <info>  [1759760948.0439] manager: (tapa0424b4b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Oct 06 14:29:08 compute-0 systemd-udevd[227768]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.085 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2cc2da-55c0-4657-ad00-106ecc6ebfec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.088 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[021c2efe-3b37-431c-83f2-cbc5b956b6c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 NetworkManager[52035]: <info>  [1759760948.1175] device (tapa0424b4b-a0): carrier: link connected
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.125 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[1be91f53-c2a9-4e58-9214-637e6436468b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.150 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7b379eb0-96e6-47e5-ae94-4bc05b68f441]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0424b4b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:1d:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540976, 'reachable_time': 36685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227798, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.173 2 DEBUG nova.compute.manager [req-606a0289-be15-4916-9783-cd6dd7566779 req-d7c82142-0fb8-4cd5-a780-8e6a9cc5e4f6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.174 2 DEBUG oslo_concurrency.lockutils [req-606a0289-be15-4916-9783-cd6dd7566779 req-d7c82142-0fb8-4cd5-a780-8e6a9cc5e4f6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.174 2 DEBUG oslo_concurrency.lockutils [req-606a0289-be15-4916-9783-cd6dd7566779 req-d7c82142-0fb8-4cd5-a780-8e6a9cc5e4f6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.175 2 DEBUG oslo_concurrency.lockutils [req-606a0289-be15-4916-9783-cd6dd7566779 req-d7c82142-0fb8-4cd5-a780-8e6a9cc5e4f6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.175 2 DEBUG nova.compute.manager [req-606a0289-be15-4916-9783-cd6dd7566779 req-d7c82142-0fb8-4cd5-a780-8e6a9cc5e4f6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Processing event network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.176 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad9b87e-a6e8-4736-8a02-d1e805b2871d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:1d79'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540976, 'tstamp': 540976}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227799, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.203 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[30daf723-a024-49ea-9513-4408c29ad449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0424b4b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:1d:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540976, 'reachable_time': 36685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227801, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.252 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[18607a12-fc8e-4558-9dcc-0f36f0497c30]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.326 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a6c316-c152-41f2-999c-4ef3f20c65c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.328 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0424b4b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.328 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.328 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0424b4b-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:08 compute-0 NetworkManager[52035]: <info>  [1759760948.3305] manager: (tapa0424b4b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 kernel: tapa0424b4b-a0: entered promiscuous mode
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.333 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0424b4b-a0, col_values=(('external_ids', {'iface-id': '8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 ovn_controller[95205]: 2025-10-06T14:29:08Z|00264|binding|INFO|Releasing lport 8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe from this chassis (sb_readonly=0)
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.337 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8ea5ea-016a-4994-bb56-6aa662e83045]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.338 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.338 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.339 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a0424b4b-abad-433f-a1fd-549d3e8c60ac disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.339 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.340 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d99632da-33ab-4376-a595-3198bb73a216]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.341 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.341 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c89a0c10-79f8-46e3-b8eb-e76ec18ee31d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.342 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-a0424b4b-abad-433f-a1fd-549d3e8c60ac
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID a0424b4b-abad-433f-a1fd-549d3e8c60ac
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:29:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:08.344 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'env', 'PROCESS_TAG=haproxy-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a0424b4b-abad-433f-a1fd-549d3e8c60ac.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.511 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 97c347c0-834f-446c-8585-a132ba411853 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.511 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.512 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:29:07 up  1:30,  0 user,  load average: 0.08, 0.17, 0.27\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_3d54ae6464194adf9e6c766021f7d34d': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.543 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.728 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.732 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.736 2 INFO nova.virt.libvirt.driver [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance spawned successfully.
Oct 06 14:29:08 compute-0 nova_compute[192903]: 2025-10-06 14:29:08.736 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:29:08 compute-0 podman[227836]: 2025-10-06 14:29:08.803679606 +0000 UTC m=+0.057775269 container create 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:29:08 compute-0 systemd[1]: Started libpod-conmon-1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d.scope.
Oct 06 14:29:08 compute-0 podman[227836]: 2025-10-06 14:29:08.773288126 +0000 UTC m=+0.027383799 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:29:08 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:29:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f504b3b313b6d0f864e875420e3d00240d4b08b2693d34ee1c8512df285f9677/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:29:08 compute-0 podman[227836]: 2025-10-06 14:29:08.894700151 +0000 UTC m=+0.148795814 container init 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 06 14:29:08 compute-0 podman[227836]: 2025-10-06 14:29:08.900213832 +0000 UTC m=+0.154309475 container start 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:29:08 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [NOTICE]   (227874) : New worker (227877) forked
Oct 06 14:29:08 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [NOTICE]   (227874) : Loading success.
Oct 06 14:29:08 compute-0 podman[227849]: 2025-10-06 14:29:08.935984309 +0000 UTC m=+0.091200042 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.050 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.250 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.250 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.251 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.252 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.253 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.253 2 DEBUG nova.virt.libvirt.driver [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.562 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.562 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.769 2 INFO nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Took 10.33 seconds to spawn the instance on the hypervisor.
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.770 2 DEBUG nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:29:09 compute-0 nova_compute[192903]: 2025-10-06 14:29:09.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.227 2 DEBUG nova.compute.manager [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.228 2 DEBUG oslo_concurrency.lockutils [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.229 2 DEBUG oslo_concurrency.lockutils [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.229 2 DEBUG oslo_concurrency.lockutils [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.229 2 DEBUG nova.compute.manager [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] No waiting events found dispatching network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.230 2 WARNING nova.compute.manager [req-9c94e61b-6a47-4486-a74a-9e419c8dfd33 req-96d525d8-43a1-4fad-a77f-ff1749499e25 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received unexpected event network-vif-plugged-88a071f7-feca-46b3-92db-78710d8b027b for instance with vm_state active and task_state None.
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.307 2 INFO nova.compute.manager [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Took 15.79 seconds to build instance.
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:10 compute-0 nova_compute[192903]: 2025-10-06 14:29:10.812 2 DEBUG oslo_concurrency.lockutils [None req-93d28fd5-2887-4ff8-936b-aad78cae3f0a 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.307s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:11.405 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:11.405 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:11.405 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:12 compute-0 nova_compute[192903]: 2025-10-06 14:29:12.562 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:12 compute-0 nova_compute[192903]: 2025-10-06 14:29:12.563 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:12 compute-0 nova_compute[192903]: 2025-10-06 14:29:12.564 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:29:12 compute-0 nova_compute[192903]: 2025-10-06 14:29:12.579 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:13 compute-0 nova_compute[192903]: 2025-10-06 14:29:13.088 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:13 compute-0 nova_compute[192903]: 2025-10-06 14:29:13.088 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:29:13 compute-0 podman[227887]: 2025-10-06 14:29:13.230082609 +0000 UTC m=+0.090453741 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 06 14:29:13 compute-0 nova_compute[192903]: 2025-10-06 14:29:13.597 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:29:14 compute-0 nova_compute[192903]: 2025-10-06 14:29:14.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:15 compute-0 nova_compute[192903]: 2025-10-06 14:29:15.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:16 compute-0 nova_compute[192903]: 2025-10-06 14:29:16.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:18 compute-0 nova_compute[192903]: 2025-10-06 14:29:18.088 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:18 compute-0 nova_compute[192903]: 2025-10-06 14:29:18.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:19 compute-0 nova_compute[192903]: 2025-10-06 14:29:19.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:19 compute-0 nova_compute[192903]: 2025-10-06 14:29:19.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:20 compute-0 nova_compute[192903]: 2025-10-06 14:29:20.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:21 compute-0 ovn_controller[95205]: 2025-10-06T14:29:21Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:ee:84 10.100.0.3
Oct 06 14:29:21 compute-0 ovn_controller[95205]: 2025-10-06T14:29:21Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:ee:84 10.100.0.3
Oct 06 14:29:22 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:29:24 compute-0 nova_compute[192903]: 2025-10-06 14:29:24.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:25 compute-0 nova_compute[192903]: 2025-10-06 14:29:25.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:29 compute-0 podman[227924]: 2025-10-06 14:29:29.195071496 +0000 UTC m=+0.055519297 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 14:29:29 compute-0 podman[227923]: 2025-10-06 14:29:29.196908796 +0000 UTC m=+0.057330077 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 14:29:29 compute-0 podman[227925]: 2025-10-06 14:29:29.219588015 +0000 UTC m=+0.071018400 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:29:29 compute-0 podman[227922]: 2025-10-06 14:29:29.228080127 +0000 UTC m=+0.090150173 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:29:29 compute-0 podman[203308]: time="2025-10-06T14:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:29:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20763 "" "Go-http-client/1.1"
Oct 06 14:29:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Oct 06 14:29:29 compute-0 nova_compute[192903]: 2025-10-06 14:29:29.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:30 compute-0 nova_compute[192903]: 2025-10-06 14:29:30.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: ERROR   14:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: ERROR   14:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: ERROR   14:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: ERROR   14:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: ERROR   14:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:29:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:29:31 compute-0 nova_compute[192903]: 2025-10-06 14:29:31.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:31 compute-0 nova_compute[192903]: 2025-10-06 14:29:31.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:29:34 compute-0 nova_compute[192903]: 2025-10-06 14:29:34.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:35 compute-0 nova_compute[192903]: 2025-10-06 14:29:35.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:38 compute-0 ovn_controller[95205]: 2025-10-06T14:29:38Z|00265|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct 06 14:29:39 compute-0 podman[228006]: 2025-10-06 14:29:39.203288261 +0000 UTC m=+0.062914038 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible)
Oct 06 14:29:39 compute-0 nova_compute[192903]: 2025-10-06 14:29:39.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:40 compute-0 nova_compute[192903]: 2025-10-06 14:29:40.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:40 compute-0 nova_compute[192903]: 2025-10-06 14:29:40.594 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Creating tmpfile /var/lib/nova/instances/tmp1vtb8b1g to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:29:40 compute-0 nova_compute[192903]: 2025-10-06 14:29:40.595 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:40 compute-0 nova_compute[192903]: 2025-10-06 14:29:40.698 2 DEBUG nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1vtb8b1g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:29:42 compute-0 nova_compute[192903]: 2025-10-06 14:29:42.141 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:29:42 compute-0 nova_compute[192903]: 2025-10-06 14:29:42.651 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Triggering sync for uuid 97c347c0-834f-446c-8585-a132ba411853 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Oct 06 14:29:42 compute-0 nova_compute[192903]: 2025-10-06 14:29:42.652 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:42 compute-0 nova_compute[192903]: 2025-10-06 14:29:42.653 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "97c347c0-834f-446c-8585-a132ba411853" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:42 compute-0 nova_compute[192903]: 2025-10-06 14:29:42.740 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:43 compute-0 nova_compute[192903]: 2025-10-06 14:29:43.163 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "97c347c0-834f-446c-8585-a132ba411853" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.510s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:44 compute-0 podman[228027]: 2025-10-06 14:29:44.24699864 +0000 UTC m=+0.102294304 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, release=1755695350, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6)
Oct 06 14:29:44 compute-0 nova_compute[192903]: 2025-10-06 14:29:44.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:45 compute-0 nova_compute[192903]: 2025-10-06 14:29:45.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:47 compute-0 nova_compute[192903]: 2025-10-06 14:29:47.133 2 DEBUG nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1vtb8b1g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='90c83b67-c2b6-49d6-a2e5-f025b87cd378',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:29:48 compute-0 nova_compute[192903]: 2025-10-06 14:29:48.148 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:29:48 compute-0 nova_compute[192903]: 2025-10-06 14:29:48.149 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:29:48 compute-0 nova_compute[192903]: 2025-10-06 14:29:48.149 2 DEBUG nova.network.neutron [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:29:48 compute-0 nova_compute[192903]: 2025-10-06 14:29:48.655 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:49 compute-0 nova_compute[192903]: 2025-10-06 14:29:49.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:49 compute-0 nova_compute[192903]: 2025-10-06 14:29:49.933 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.099 2 DEBUG nova.network.neutron [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Updating instance_info_cache with network_info: [{"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.605 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.619 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1vtb8b1g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='90c83b67-c2b6-49d6-a2e5-f025b87cd378',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.620 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Creating instance directory: /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.620 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Creating disk.info with the contents: {'/var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk': 'qcow2', '/var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.621 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:29:50 compute-0 nova_compute[192903]: 2025-10-06 14:29:50.621 2 DEBUG nova.objects.instance [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 90c83b67-c2b6-49d6-a2e5-f025b87cd378 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.127 2 DEBUG oslo_utils.imageutils.format_inspector [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.133 2 DEBUG oslo_utils.imageutils.format_inspector [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.135 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.230 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.231 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.232 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.232 2 DEBUG oslo_utils.imageutils.format_inspector [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.238 2 DEBUG oslo_utils.imageutils.format_inspector [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.238 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.305 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.306 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.342 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.343 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.344 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.400 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.401 2 DEBUG nova.virt.disk.api [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.401 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.461 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.463 2 DEBUG nova.virt.disk.api [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.464 2 DEBUG nova.objects.instance [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 90c83b67-c2b6-49d6-a2e5-f025b87cd378 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.971 2 DEBUG nova.objects.base [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<90c83b67-c2b6-49d6-a2e5-f025b87cd378> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:29:51 compute-0 nova_compute[192903]: 2025-10-06 14:29:51.973 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.009 2 DEBUG oslo_concurrency.processutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk.config 497664" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.010 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.013 2 DEBUG nova.virt.libvirt.vif [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2144595250',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2144595250',id=28,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:28:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3d54ae6464194adf9e6c766021f7d34d',ramdisk_id='',reservation_id='r-77of7lm2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:28:49Z,user_data=None,user_id='056f3d4527be4c01acce85b1b5641775',uuid=90c83b67-c2b6-49d6-a2e5-f025b87cd378,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.013 2 DEBUG nova.network.os_vif_util [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.015 2 DEBUG nova.network.os_vif_util [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.016 2 DEBUG os_vif [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.017 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.018 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.020 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '311a5d22-8f0e-5ca0-bda3-d51986d4dab5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.028 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c91af92-7b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0c91af92-7b, col_values=(('qos', UUID('515e112b-0858-41ff-b3ea-4d8fa94f4406')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.030 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0c91af92-7b, col_values=(('external_ids', {'iface-id': '0c91af92-7b04-42db-8657-8d0776baacb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:23:ac', 'vm-uuid': '90c83b67-c2b6-49d6-a2e5-f025b87cd378'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 NetworkManager[52035]: <info>  [1759760992.0328] manager: (tap0c91af92-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.043 2 INFO os_vif [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b')
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.043 2 DEBUG nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.044 2 DEBUG nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1vtb8b1g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='90c83b67-c2b6-49d6-a2e5-f025b87cd378',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.044 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:52 compute-0 nova_compute[192903]: 2025-10-06 14:29:52.559 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:29:53 compute-0 nova_compute[192903]: 2025-10-06 14:29:53.545 2 DEBUG nova.network.neutron [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Port 0c91af92-7b04-42db-8657-8d0776baacb5 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:29:53 compute-0 nova_compute[192903]: 2025-10-06 14:29:53.559 2 DEBUG nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1vtb8b1g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='90c83b67-c2b6-49d6-a2e5-f025b87cd378',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:29:54 compute-0 nova_compute[192903]: 2025-10-06 14:29:54.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:29:56 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:29:56 compute-0 NetworkManager[52035]: <info>  [1759760996.6334] manager: (tap0c91af92-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Oct 06 14:29:56 compute-0 kernel: tap0c91af92-7b: entered promiscuous mode
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 ovn_controller[95205]: 2025-10-06T14:29:56Z|00266|binding|INFO|Claiming lport 0c91af92-7b04-42db-8657-8d0776baacb5 for this additional chassis.
Oct 06 14:29:56 compute-0 ovn_controller[95205]: 2025-10-06T14:29:56Z|00267|binding|INFO|0c91af92-7b04-42db-8657-8d0776baacb5: Claiming fa:16:3e:b7:23:ac 10.100.0.7
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.645 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:23:ac 10.100.0.7'], port_security=['fa:16:3e:b7:23:ac 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '90c83b67-c2b6-49d6-a2e5-f025b87cd378', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '14ab3d49-c7f0-4724-a78b-49797efce79d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c178903-2d5a-46e7-86ba-a026de5c67dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=0c91af92-7b04-42db-8657-8d0776baacb5) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.646 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 0c91af92-7b04-42db-8657-8d0776baacb5 in datapath a0424b4b-abad-433f-a1fd-549d3e8c60ac unbound from our chassis
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.648 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0424b4b-abad-433f-a1fd-549d3e8c60ac
Oct 06 14:29:56 compute-0 ovn_controller[95205]: 2025-10-06T14:29:56Z|00268|binding|INFO|Setting lport 0c91af92-7b04-42db-8657-8d0776baacb5 ovn-installed in OVS
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.668 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[db288e34-2a90-4940-9d89-c213fbf13dc8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 systemd-machined[152985]: New machine qemu-24-instance-0000001c.
Oct 06 14:29:56 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001c.
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.705 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[92048050-482a-4a0f-8dd8-7f6ecf935691]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.708 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[423858f7-5a8e-497d-b972-99185e566165]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 systemd-udevd[228103]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:29:56 compute-0 NetworkManager[52035]: <info>  [1759760996.7401] device (tap0c91af92-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:29:56 compute-0 NetworkManager[52035]: <info>  [1759760996.7415] device (tap0c91af92-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.753 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[cba1746b-bcd0-458b-a138-2dc9df98b0d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.776 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbe9b56-d999-42cb-84ce-89d7a4777e05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0424b4b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:1d:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540976, 'reachable_time': 36685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228113, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.799 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3661a2bb-959d-4a44-bcdc-abb512ba43c8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa0424b4b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540992, 'tstamp': 540992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228114, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa0424b4b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540996, 'tstamp': 540996}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228114, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.800 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0424b4b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 nova_compute[192903]: 2025-10-06 14:29:56.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.804 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0424b4b-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.804 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.805 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0424b4b-a0, col_values=(('external_ids', {'iface-id': '8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.805 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:29:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:29:56.808 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1166f70e-3d94-48f2-9395-e2f2c57bd9a8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a0424b4b-abad-433f-a1fd-549d3e8c60ac\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a0424b4b-abad-433f-a1fd-549d3e8c60ac\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:29:57 compute-0 nova_compute[192903]: 2025-10-06 14:29:57.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:29:59 compute-0 podman[203308]: time="2025-10-06T14:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:29:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20763 "" "Go-http-client/1.1"
Oct 06 14:29:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 06 14:29:59 compute-0 nova_compute[192903]: 2025-10-06 14:29:59.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:00 compute-0 podman[228140]: 2025-10-06 14:30:00.195842166 +0000 UTC m=+0.049138163 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 06 14:30:00 compute-0 podman[228139]: 2025-10-06 14:30:00.202003514 +0000 UTC m=+0.062505748 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd)
Oct 06 14:30:00 compute-0 podman[228141]: 2025-10-06 14:30:00.221343132 +0000 UTC m=+0.074317330 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:30:00 compute-0 podman[228138]: 2025-10-06 14:30:00.247777454 +0000 UTC m=+0.108321589 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: ERROR   14:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: ERROR   14:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: ERROR   14:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: ERROR   14:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: ERROR   14:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:30:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:30:01 compute-0 nova_compute[192903]: 2025-10-06 14:30:01.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:01 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:01.806 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:30:01 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:01.807 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:30:01 compute-0 ovn_controller[95205]: 2025-10-06T14:30:01Z|00269|binding|INFO|Claiming lport 0c91af92-7b04-42db-8657-8d0776baacb5 for this chassis.
Oct 06 14:30:01 compute-0 ovn_controller[95205]: 2025-10-06T14:30:01Z|00270|binding|INFO|0c91af92-7b04-42db-8657-8d0776baacb5: Claiming fa:16:3e:b7:23:ac 10.100.0.7
Oct 06 14:30:01 compute-0 ovn_controller[95205]: 2025-10-06T14:30:01Z|00271|binding|INFO|Setting lport 0c91af92-7b04-42db-8657-8d0776baacb5 up in Southbound
Oct 06 14:30:02 compute-0 nova_compute[192903]: 2025-10-06 14:30:02.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:02 compute-0 nova_compute[192903]: 2025-10-06 14:30:02.094 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:02 compute-0 nova_compute[192903]: 2025-10-06 14:30:02.941 2 INFO nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Post operation of migration started
Oct 06 14:30:02 compute-0 nova_compute[192903]: 2025-10-06 14:30:02.942 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:03 compute-0 nova_compute[192903]: 2025-10-06 14:30:03.563 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:03 compute-0 nova_compute[192903]: 2025-10-06 14:30:03.564 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:03 compute-0 nova_compute[192903]: 2025-10-06 14:30:03.677 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:30:03 compute-0 nova_compute[192903]: 2025-10-06 14:30:03.678 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:30:03 compute-0 nova_compute[192903]: 2025-10-06 14:30:03.678 2 DEBUG nova.network.neutron [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:30:04 compute-0 nova_compute[192903]: 2025-10-06 14:30:04.191 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:04 compute-0 nova_compute[192903]: 2025-10-06 14:30:04.662 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:04 compute-0 nova_compute[192903]: 2025-10-06 14:30:04.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:04 compute-0 nova_compute[192903]: 2025-10-06 14:30:04.807 2 DEBUG nova.network.neutron [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Updating instance_info_cache with network_info: [{"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:30:04 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:04.809 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.319 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-90c83b67-c2b6-49d6-a2e5-f025b87cd378" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.840 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.841 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.841 2 DEBUG oslo_concurrency.lockutils [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:05 compute-0 nova_compute[192903]: 2025-10-06 14:30:05.845 2 INFO nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:30:05 compute-0 virtqemud[192802]: Domain id=24 name='instance-0000001c' uuid=90c83b67-c2b6-49d6-a2e5-f025b87cd378 is tainted: custom-monitor
Oct 06 14:30:06 compute-0 nova_compute[192903]: 2025-10-06 14:30:06.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:06 compute-0 nova_compute[192903]: 2025-10-06 14:30:06.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:06 compute-0 nova_compute[192903]: 2025-10-06 14:30:06.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:06 compute-0 nova_compute[192903]: 2025-10-06 14:30:06.096 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:30:06 compute-0 nova_compute[192903]: 2025-10-06 14:30:06.853 2 INFO nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.150 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.205 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.206 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.297 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.303 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.354 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.355 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.414 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.546 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.547 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.565 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.566 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5481MB free_disk=73.2422866821289GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.566 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.566 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.859 2 INFO nova.virt.libvirt.driver [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:30:07 compute-0 nova_compute[192903]: 2025-10-06 14:30:07.865 2 DEBUG nova.compute.manager [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:30:08 compute-0 nova_compute[192903]: 2025-10-06 14:30:08.373 2 DEBUG nova.objects.instance [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:30:08 compute-0 nova_compute[192903]: 2025-10-06 14:30:08.586 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Migration for instance 90c83b67-c2b6-49d6-a2e5-f025b87cd378 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.093 2 INFO nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Updating resource usage from migration 6dfcc00a-d2b3-41ac-9442-0d04d447bd7e
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.094 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Starting to track incoming migration 6dfcc00a-d2b3-41ac-9442-0d04d447bd7e with flavor 8cb06c85-e9e7-417f-906b-1f7cf29f7de9 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.392 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.559 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.560 2 WARNING neutronclient.v2_0.client [None req-220a472a-4718-4509-bb5f-84b64b0109ff f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.641 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 97c347c0-834f-446c-8585-a132ba411853 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:30:09 compute-0 nova_compute[192903]: 2025-10-06 14:30:09.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:10 compute-0 nova_compute[192903]: 2025-10-06 14:30:10.147 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 90c83b67-c2b6-49d6-a2e5-f025b87cd378 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.
Oct 06 14:30:10 compute-0 nova_compute[192903]: 2025-10-06 14:30:10.148 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:30:10 compute-0 nova_compute[192903]: 2025-10-06 14:30:10.148 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:30:07 up  1:31,  0 user,  load average: 0.28, 0.21, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3d54ae6464194adf9e6c766021f7d34d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:30:10 compute-0 podman[228238]: 2025-10-06 14:30:10.208126194 +0000 UTC m=+0.060818952 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:30:10 compute-0 nova_compute[192903]: 2025-10-06 14:30:10.266 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:30:10 compute-0 nova_compute[192903]: 2025-10-06 14:30:10.774 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:30:11 compute-0 nova_compute[192903]: 2025-10-06 14:30:11.286 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:30:11 compute-0 nova_compute[192903]: 2025-10-06 14:30:11.286 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.720s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:11.408 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:11.408 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:11.409 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:12 compute-0 nova_compute[192903]: 2025-10-06 14:30:12.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:14 compute-0 nova_compute[192903]: 2025-10-06 14:30:14.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.013 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.014 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.014 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.014 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.014 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.027 2 INFO nova.compute.manager [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Terminating instance
Oct 06 14:30:15 compute-0 podman[228261]: 2025-10-06 14:30:15.216305913 +0000 UTC m=+0.087201892 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.545 2 DEBUG nova.compute.manager [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:30:15 compute-0 kernel: tap88a071f7-fe (unregistering): left promiscuous mode
Oct 06 14:30:15 compute-0 NetworkManager[52035]: <info>  [1759761015.5732] device (tap88a071f7-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:30:15 compute-0 ovn_controller[95205]: 2025-10-06T14:30:15Z|00272|binding|INFO|Releasing lport 88a071f7-feca-46b3-92db-78710d8b027b from this chassis (sb_readonly=0)
Oct 06 14:30:15 compute-0 ovn_controller[95205]: 2025-10-06T14:30:15Z|00273|binding|INFO|Setting lport 88a071f7-feca-46b3-92db-78710d8b027b down in Southbound
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 ovn_controller[95205]: 2025-10-06T14:30:15Z|00274|binding|INFO|Removing iface tap88a071f7-fe ovn-installed in OVS
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.615 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:ee:84 10.100.0.3'], port_security=['fa:16:3e:b9:ee:84 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '97c347c0-834f-446c-8585-a132ba411853', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '14ab3d49-c7f0-4724-a78b-49797efce79d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c178903-2d5a-46e7-86ba-a026de5c67dd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=88a071f7-feca-46b3-92db-78710d8b027b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.616 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 88a071f7-feca-46b3-92db-78710d8b027b in datapath a0424b4b-abad-433f-a1fd-549d3e8c60ac unbound from our chassis
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.617 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0424b4b-abad-433f-a1fd-549d3e8c60ac
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.640 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[16cc7c8c-3ff3-40cf-809b-4c1397488487]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.671 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[476d35b0-5746-48dd-8973-5e1dd3d76133]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.673 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[fa45c725-a30c-47aa-b6e4-5a449519c267]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct 06 14:30:15 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001d.scope: Consumed 15.060s CPU time.
Oct 06 14:30:15 compute-0 systemd-machined[152985]: Machine qemu-23-instance-0000001d terminated.
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.701 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[289cece5-f5b8-44d2-a7f1-0f11ba510174]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.721 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2159caea-30f4-4c0c-93d3-27eb38a706fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0424b4b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:1d:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540976, 'reachable_time': 20490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228293, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.744 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bd471de9-20ce-4116-b2f8-cf498123215c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa0424b4b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540992, 'tstamp': 540992}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228294, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa0424b4b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540996, 'tstamp': 540996}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228294, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.745 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0424b4b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.751 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0424b4b-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.751 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.751 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0424b4b-a0, col_values=(('external_ids', {'iface-id': '8f8e5c8a-64ad-42f5-a8bc-4f5dd2e00cbe'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.751 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:30:15 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:15.754 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[15d64d60-59f3-4464-80bc-f7edc23d7dd7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a0424b4b-abad-433f-a1fd-549d3e8c60ac\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a0424b4b-abad-433f-a1fd-549d3e8c60ac\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.820 2 INFO nova.virt.libvirt.driver [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] Instance destroyed successfully.
Oct 06 14:30:15 compute-0 nova_compute[192903]: 2025-10-06 14:30:15.820 2 DEBUG nova.objects.instance [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lazy-loading 'resources' on Instance uuid 97c347c0-834f-446c-8585-a132ba411853 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.287 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.288 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.288 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.328 2 DEBUG nova.virt.libvirt.vif [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:28:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1966725592',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1966725592',id=29,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:29:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d54ae6464194adf9e6c766021f7d34d',ramdisk_id='',reservation_id='r-ugfbr727',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:29:09Z,user_data=None,user_id='056f3d4527be4c01acce85b1b5641775',uuid=97c347c0-834f-446c-8585-a132ba411853,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.328 2 DEBUG nova.network.os_vif_util [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converting VIF {"id": "88a071f7-feca-46b3-92db-78710d8b027b", "address": "fa:16:3e:b9:ee:84", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88a071f7-fe", "ovs_interfaceid": "88a071f7-feca-46b3-92db-78710d8b027b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.329 2 DEBUG nova.network.os_vif_util [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.330 2 DEBUG os_vif [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.336 2 DEBUG nova.compute.manager [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.337 2 DEBUG oslo_concurrency.lockutils [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.338 2 DEBUG oslo_concurrency.lockutils [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.338 2 DEBUG oslo_concurrency.lockutils [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.339 2 DEBUG nova.compute.manager [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] No waiting events found dispatching network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.339 2 DEBUG nova.compute.manager [req-a5c5627b-9df5-4fcf-9297-deb9cf7dd050 req-b98b2dae-775b-4bf4-803e-874aa2a50877 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88a071f7-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.346 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=750ffbd0-ee5c-4bad-81a0-fb84c9d2bc4f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.351 2 INFO os_vif [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:ee:84,bridge_name='br-int',has_traffic_filtering=True,id=88a071f7-feca-46b3-92db-78710d8b027b,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88a071f7-fe')
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.352 2 INFO nova.virt.libvirt.driver [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Deleting instance files /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853_del
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.353 2 INFO nova.virt.libvirt.driver [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Deletion of /var/lib/nova/instances/97c347c0-834f-446c-8585-a132ba411853_del complete
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.862 2 INFO nova.compute.manager [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.862 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.862 2 DEBUG nova.compute.manager [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.862 2 DEBUG nova.network.neutron [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:30:16 compute-0 nova_compute[192903]: 2025-10-06 14:30:16.862 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:17 compute-0 nova_compute[192903]: 2025-10-06 14:30:17.319 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:17 compute-0 nova_compute[192903]: 2025-10-06 14:30:17.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:17 compute-0 nova_compute[192903]: 2025-10-06 14:30:17.863 2 DEBUG nova.compute.manager [req-a379c273-bef8-40f9-8156-950e5ed74ec1 req-2deb6b11-e06f-4677-b903-5d9c60d8a3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-deleted-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:17 compute-0 nova_compute[192903]: 2025-10-06 14:30:17.864 2 INFO nova.compute.manager [req-a379c273-bef8-40f9-8156-950e5ed74ec1 req-2deb6b11-e06f-4677-b903-5d9c60d8a3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Neutron deleted interface 88a071f7-feca-46b3-92db-78710d8b027b; detaching it from the instance and deleting it from the info cache
Oct 06 14:30:17 compute-0 nova_compute[192903]: 2025-10-06 14:30:17.864 2 DEBUG nova.network.neutron [req-a379c273-bef8-40f9-8156-950e5ed74ec1 req-2deb6b11-e06f-4677-b903-5d9c60d8a3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.286 2 DEBUG nova.network.neutron [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.372 2 DEBUG nova.compute.manager [req-a379c273-bef8-40f9-8156-950e5ed74ec1 req-2deb6b11-e06f-4677-b903-5d9c60d8a3c8 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Detach interface failed, port_id=88a071f7-feca-46b3-92db-78710d8b027b, reason: Instance 97c347c0-834f-446c-8585-a132ba411853 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.396 2 DEBUG nova.compute.manager [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.396 2 DEBUG oslo_concurrency.lockutils [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "97c347c0-834f-446c-8585-a132ba411853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.397 2 DEBUG oslo_concurrency.lockutils [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.398 2 DEBUG oslo_concurrency.lockutils [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.398 2 DEBUG nova.compute.manager [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] No waiting events found dispatching network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.399 2 DEBUG nova.compute.manager [req-b1894b1f-eeba-4974-9263-38aa4773b6a3 req-377bf324-9002-4ea1-b253-fc8dc7395fc9 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 97c347c0-834f-446c-8585-a132ba411853] Received event network-vif-unplugged-88a071f7-feca-46b3-92db-78710d8b027b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:30:18 compute-0 nova_compute[192903]: 2025-10-06 14:30:18.793 2 INFO nova.compute.manager [-] [instance: 97c347c0-834f-446c-8585-a132ba411853] Took 1.93 seconds to deallocate network for instance.
Oct 06 14:30:19 compute-0 nova_compute[192903]: 2025-10-06 14:30:19.317 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:19 compute-0 nova_compute[192903]: 2025-10-06 14:30:19.318 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:19 compute-0 nova_compute[192903]: 2025-10-06 14:30:19.386 2 DEBUG nova.compute.provider_tree [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:30:19 compute-0 nova_compute[192903]: 2025-10-06 14:30:19.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:19 compute-0 nova_compute[192903]: 2025-10-06 14:30:19.896 2 DEBUG nova.scheduler.client.report [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:30:20 compute-0 nova_compute[192903]: 2025-10-06 14:30:20.422 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:20 compute-0 nova_compute[192903]: 2025-10-06 14:30:20.468 2 INFO nova.scheduler.client.report [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Deleted allocations for instance 97c347c0-834f-446c-8585-a132ba411853
Oct 06 14:30:20 compute-0 nova_compute[192903]: 2025-10-06 14:30:20.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:21 compute-0 nova_compute[192903]: 2025-10-06 14:30:21.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:21 compute-0 nova_compute[192903]: 2025-10-06 14:30:21.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:30:21 compute-0 nova_compute[192903]: 2025-10-06 14:30:21.878 2 DEBUG oslo_concurrency.lockutils [None req-9e6d997b-523a-4237-9b2a-bda39590abf8 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "97c347c0-834f-446c-8585-a132ba411853" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.864s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.796 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.797 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.798 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.798 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.799 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:24 compute-0 nova_compute[192903]: 2025-10-06 14:30:24.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:26 compute-0 nova_compute[192903]: 2025-10-06 14:30:26.185 2 INFO nova.compute.manager [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Terminating instance
Oct 06 14:30:26 compute-0 nova_compute[192903]: 2025-10-06 14:30:26.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:27 compute-0 nova_compute[192903]: 2025-10-06 14:30:27.861 2 DEBUG nova.compute.manager [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:30:27 compute-0 kernel: tap0c91af92-7b (unregistering): left promiscuous mode
Oct 06 14:30:27 compute-0 NetworkManager[52035]: <info>  [1759761027.8895] device (tap0c91af92-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:30:27 compute-0 nova_compute[192903]: 2025-10-06 14:30:27.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:27 compute-0 ovn_controller[95205]: 2025-10-06T14:30:27Z|00275|binding|INFO|Releasing lport 0c91af92-7b04-42db-8657-8d0776baacb5 from this chassis (sb_readonly=0)
Oct 06 14:30:27 compute-0 ovn_controller[95205]: 2025-10-06T14:30:27Z|00276|binding|INFO|Setting lport 0c91af92-7b04-42db-8657-8d0776baacb5 down in Southbound
Oct 06 14:30:27 compute-0 ovn_controller[95205]: 2025-10-06T14:30:27Z|00277|binding|INFO|Removing iface tap0c91af92-7b ovn-installed in OVS
Oct 06 14:30:27 compute-0 nova_compute[192903]: 2025-10-06 14:30:27.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:27.912 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:23:ac 10.100.0.7'], port_security=['fa:16:3e:b7:23:ac 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '90c83b67-c2b6-49d6-a2e5-f025b87cd378', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d54ae6464194adf9e6c766021f7d34d', 'neutron:revision_number': '14', 'neutron:security_group_ids': '14ab3d49-c7f0-4724-a78b-49797efce79d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c178903-2d5a-46e7-86ba-a026de5c67dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=0c91af92-7b04-42db-8657-8d0776baacb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:30:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:27.914 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 0c91af92-7b04-42db-8657-8d0776baacb5 in datapath a0424b4b-abad-433f-a1fd-549d3e8c60ac unbound from our chassis
Oct 06 14:30:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:27.917 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0424b4b-abad-433f-a1fd-549d3e8c60ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:30:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:27.917 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9cc3ec-8ddb-45d3-b971-e2204c4b161c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:27 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:27.918 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac namespace which is not needed anymore
Oct 06 14:30:27 compute-0 nova_compute[192903]: 2025-10-06 14:30:27.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:27 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct 06 14:30:27 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001c.scope: Consumed 3.345s CPU time.
Oct 06 14:30:27 compute-0 systemd-machined[152985]: Machine qemu-24-instance-0000001c terminated.
Oct 06 14:30:28 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [NOTICE]   (227874) : haproxy version is 3.0.5-8e879a5
Oct 06 14:30:28 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [NOTICE]   (227874) : path to executable is /usr/sbin/haproxy
Oct 06 14:30:28 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [WARNING]  (227874) : Exiting Master process...
Oct 06 14:30:28 compute-0 podman[228338]: 2025-10-06 14:30:28.07905074 +0000 UTC m=+0.038534883 container kill 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 06 14:30:28 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [ALERT]    (227874) : Current worker (227877) exited with code 143 (Terminated)
Oct 06 14:30:28 compute-0 neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac[227852]: [WARNING]  (227874) : All workers exited. Exiting... (0)
Oct 06 14:30:28 compute-0 systemd[1]: libpod-1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d.scope: Deactivated successfully.
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.103 2 DEBUG nova.compute.manager [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Received event network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.104 2 DEBUG oslo_concurrency.lockutils [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.104 2 DEBUG oslo_concurrency.lockutils [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.105 2 DEBUG oslo_concurrency.lockutils [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.105 2 DEBUG nova.compute.manager [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] No waiting events found dispatching network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.106 2 DEBUG nova.compute.manager [req-f173ca90-d132-46e0-9eaa-056cb9635009 req-867d755f-b4bc-463a-a026-5e2618a3eca6 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Received event network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:30:28 compute-0 podman[228357]: 2025-10-06 14:30:28.128099727 +0000 UTC m=+0.028752488 container died 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.141 2 INFO nova.virt.libvirt.driver [-] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Instance destroyed successfully.
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.142 2 DEBUG nova.objects.instance [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lazy-loading 'resources' on Instance uuid 90c83b67-c2b6-49d6-a2e5-f025b87cd378 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:30:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-f504b3b313b6d0f864e875420e3d00240d4b08b2693d34ee1c8512df285f9677-merged.mount: Deactivated successfully.
Oct 06 14:30:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d-userdata-shm.mount: Deactivated successfully.
Oct 06 14:30:28 compute-0 podman[228357]: 2025-10-06 14:30:28.173883833 +0000 UTC m=+0.074536574 container cleanup 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 14:30:28 compute-0 systemd[1]: libpod-conmon-1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d.scope: Deactivated successfully.
Oct 06 14:30:28 compute-0 podman[228361]: 2025-10-06 14:30:28.204085711 +0000 UTC m=+0.079756351 container remove 1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.210 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd60ebb-0ff1-499a-8781-7e5adb5ab076]: (4, ("Mon Oct  6 02:30:28 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac (1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d)\n1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d\nMon Oct  6 02:30:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac (1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d)\n1f2635ac33d339ab1413a054e624be04236d6cf4317d69f33521d06d0491d72d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.212 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9331224d-7db4-4306-bbe9-e82641f475cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.213 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0424b4b-abad-433f-a1fd-549d3e8c60ac.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.213 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[5df9649e-95ee-47c7-8740-2e11055bad63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.214 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0424b4b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 kernel: tapa0424b4b-a0: left promiscuous mode
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.233 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9134385b-14d5-45be-afef-f50575f04508]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.268 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[755326c5-023a-4b7a-b674-d3d3d9315665]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.270 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[824359d5-3f8a-41e2-b9dc-fe0849a73e3d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.284 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6e63e5f7-eead-498a-a727-f3ad55592a46]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540967, 'reachable_time': 29315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228406, 'error': None, 'target': 'ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.288 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a0424b4b-abad-433f-a1fd-549d3e8c60ac deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:30:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:28.288 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[c65785dc-0cd4-4403-975a-a8e213944ed8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:28 compute-0 systemd[1]: run-netns-ovnmeta\x2da0424b4b\x2dabad\x2d433f\x2da1fd\x2d549d3e8c60ac.mount: Deactivated successfully.
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.652 2 DEBUG nova.virt.libvirt.vif [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2144595250',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2144595250',id=28,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:28:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d54ae6464194adf9e6c766021f7d34d',ramdisk_id='',reservation_id='r-77of7lm2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064487469-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:30:08Z,user_data=None,user_id='056f3d4527be4c01acce85b1b5641775',uuid=90c83b67-c2b6-49d6-a2e5-f025b87cd378,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.653 2 DEBUG nova.network.os_vif_util [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converting VIF {"id": "0c91af92-7b04-42db-8657-8d0776baacb5", "address": "fa:16:3e:b7:23:ac", "network": {"id": "a0424b4b-abad-433f-a1fd-549d3e8c60ac", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2137262730-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "84027c3a5bc24322a774ec81d91af7d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c91af92-7b", "ovs_interfaceid": "0c91af92-7b04-42db-8657-8d0776baacb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.654 2 DEBUG nova.network.os_vif_util [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.655 2 DEBUG os_vif [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.658 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c91af92-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.665 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=515e112b-0858-41ff-b3ea-4d8fa94f4406) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.668 2 INFO os_vif [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:23:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c91af92-7b04-42db-8657-8d0776baacb5,network=Network(a0424b4b-abad-433f-a1fd-549d3e8c60ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c91af92-7b')
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.669 2 INFO nova.virt.libvirt.driver [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Deleting instance files /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378_del
Oct 06 14:30:28 compute-0 nova_compute[192903]: 2025-10-06 14:30:28.669 2 INFO nova.virt.libvirt.driver [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Deletion of /var/lib/nova/instances/90c83b67-c2b6-49d6-a2e5-f025b87cd378_del complete
Oct 06 14:30:29 compute-0 podman[203308]: time="2025-10-06T14:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:30:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:30:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:30:29 compute-0 nova_compute[192903]: 2025-10-06 14:30:29.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:31 compute-0 podman[228409]: 2025-10-06 14:30:31.211787496 +0000 UTC m=+0.068700840 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:30:31 compute-0 podman[228410]: 2025-10-06 14:30:31.218948177 +0000 UTC m=+0.072165757 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:30:31 compute-0 podman[228408]: 2025-10-06 14:30:31.224595336 +0000 UTC m=+0.076672314 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:30:31 compute-0 podman[228407]: 2025-10-06 14:30:31.263388285 +0000 UTC m=+0.118882429 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest)
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: ERROR   14:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: ERROR   14:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: ERROR   14:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: ERROR   14:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: ERROR   14:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:30:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.442 2 DEBUG nova.compute.manager [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Received event network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.442 2 DEBUG oslo_concurrency.lockutils [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.442 2 DEBUG oslo_concurrency.lockutils [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.443 2 DEBUG oslo_concurrency.lockutils [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.443 2 DEBUG nova.compute.manager [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] No waiting events found dispatching network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:30:31 compute-0 nova_compute[192903]: 2025-10-06 14:30:31.443 2 DEBUG nova.compute.manager [req-1afe451b-fa78-4837-a36b-77cb425239bc req-cf329e9f-da72-4e79-bd10-d6f84f673ff2 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Received event network-vif-unplugged-0c91af92-7b04-42db-8657-8d0776baacb5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.408 2 INFO nova.compute.manager [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Took 4.55 seconds to destroy the instance on the hypervisor.
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.409 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.409 2 DEBUG nova.compute.manager [-] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.409 2 DEBUG nova.network.neutron [-] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.409 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.603 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.961 2 DEBUG nova.compute.manager [req-6188fc48-9f76-44b6-a1e1-8c7e29c2bcfb req-f15d8024-01a0-482d-b120-0be21491a871 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Received event network-vif-deleted-0c91af92-7b04-42db-8657-8d0776baacb5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.961 2 INFO nova.compute.manager [req-6188fc48-9f76-44b6-a1e1-8c7e29c2bcfb req-f15d8024-01a0-482d-b120-0be21491a871 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Neutron deleted interface 0c91af92-7b04-42db-8657-8d0776baacb5; detaching it from the instance and deleting it from the info cache
Oct 06 14:30:32 compute-0 nova_compute[192903]: 2025-10-06 14:30:32.961 2 DEBUG nova.network.neutron [req-6188fc48-9f76-44b6-a1e1-8c7e29c2bcfb req-f15d8024-01a0-482d-b120-0be21491a871 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:30:33 compute-0 nova_compute[192903]: 2025-10-06 14:30:33.401 2 DEBUG nova.network.neutron [-] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:30:33 compute-0 nova_compute[192903]: 2025-10-06 14:30:33.470 2 DEBUG nova.compute.manager [req-6188fc48-9f76-44b6-a1e1-8c7e29c2bcfb req-f15d8024-01a0-482d-b120-0be21491a871 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Detach interface failed, port_id=0c91af92-7b04-42db-8657-8d0776baacb5, reason: Instance 90c83b67-c2b6-49d6-a2e5-f025b87cd378 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:30:33 compute-0 nova_compute[192903]: 2025-10-06 14:30:33.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:33 compute-0 nova_compute[192903]: 2025-10-06 14:30:33.910 2 INFO nova.compute.manager [-] [instance: 90c83b67-c2b6-49d6-a2e5-f025b87cd378] Took 1.50 seconds to deallocate network for instance.
Oct 06 14:30:34 compute-0 nova_compute[192903]: 2025-10-06 14:30:34.437 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:30:34 compute-0 nova_compute[192903]: 2025-10-06 14:30:34.438 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:30:34 compute-0 nova_compute[192903]: 2025-10-06 14:30:34.446 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:34 compute-0 nova_compute[192903]: 2025-10-06 14:30:34.480 2 INFO nova.scheduler.client.report [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Deleted allocations for instance 90c83b67-c2b6-49d6-a2e5-f025b87cd378
Oct 06 14:30:34 compute-0 nova_compute[192903]: 2025-10-06 14:30:34.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:35 compute-0 nova_compute[192903]: 2025-10-06 14:30:35.514 2 DEBUG oslo_concurrency.lockutils [None req-0f07db8d-a99b-4003-b00b-d2877476272e 056f3d4527be4c01acce85b1b5641775 3d54ae6464194adf9e6c766021f7d34d - - default default] Lock "90c83b67-c2b6-49d6-a2e5-f025b87cd378" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.716s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:30:38 compute-0 nova_compute[192903]: 2025-10-06 14:30:38.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:39 compute-0 nova_compute[192903]: 2025-10-06 14:30:39.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:41 compute-0 podman[228495]: 2025-10-06 14:30:41.212123778 +0000 UTC m=+0.069737989 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4)
Oct 06 14:30:43 compute-0 nova_compute[192903]: 2025-10-06 14:30:43.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:43 compute-0 nova_compute[192903]: 2025-10-06 14:30:43.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:44 compute-0 nova_compute[192903]: 2025-10-06 14:30:44.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:46 compute-0 podman[228515]: 2025-10-06 14:30:46.202632798 +0000 UTC m=+0.065509952 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git)
Oct 06 14:30:48 compute-0 nova_compute[192903]: 2025-10-06 14:30:48.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:49 compute-0 nova_compute[192903]: 2025-10-06 14:30:49.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:53 compute-0 nova_compute[192903]: 2025-10-06 14:30:53.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:54 compute-0 nova_compute[192903]: 2025-10-06 14:30:54.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:58 compute-0 nova_compute[192903]: 2025-10-06 14:30:58.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:30:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:58.683 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:6a:05 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb0ecd786c974c4e9468e41534d63909', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab97e14a-5383-4896-8ec6-53d938fe85c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1ac3ed16-bb39-4937-aa85-df05f50a260e) old=Port_Binding(mac=['fa:16:3e:00:6a:05'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb0ecd786c974c4e9468e41534d63909', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:30:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:58.684 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1ac3ed16-bb39-4937-aa85-df05f50a260e in datapath e3486723-d121-43c7-9194-63860e513b31 updated
Oct 06 14:30:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:58.685 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3486723-d121-43c7-9194-63860e513b31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:30:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:30:58.687 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ec1dcf11-6b93-42b7-b4a0-50d023300d29]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:30:59 compute-0 podman[203308]: time="2025-10-06T14:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:30:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:30:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 06 14:30:59 compute-0 nova_compute[192903]: 2025-10-06 14:30:59.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: ERROR   14:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: ERROR   14:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: ERROR   14:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: ERROR   14:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: ERROR   14:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:31:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:31:02 compute-0 podman[228540]: 2025-10-06 14:31:02.228168758 +0000 UTC m=+0.080082500 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 06 14:31:02 compute-0 podman[228539]: 2025-10-06 14:31:02.247665505 +0000 UTC m=+0.093206238 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:31:02 compute-0 podman[228546]: 2025-10-06 14:31:02.248077517 +0000 UTC m=+0.082438966 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:31:02 compute-0 podman[228538]: 2025-10-06 14:31:02.26387042 +0000 UTC m=+0.125656679 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 14:31:03 compute-0 nova_compute[192903]: 2025-10-06 14:31:03.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:03 compute-0 nova_compute[192903]: 2025-10-06 14:31:03.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:04 compute-0 nova_compute[192903]: 2025-10-06 14:31:04.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:05 compute-0 nova_compute[192903]: 2025-10-06 14:31:05.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:05.670 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:31:05 compute-0 nova_compute[192903]: 2025-10-06 14:31:05.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:05 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:05.671 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.103 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.103 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.103 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.103 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.258 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.259 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.299 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.300 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=73.29997634887695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.300 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:31:06 compute-0 nova_compute[192903]: 2025-10-06 14:31:06.300 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:31:07 compute-0 nova_compute[192903]: 2025-10-06 14:31:07.348 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:31:07 compute-0 nova_compute[192903]: 2025-10-06 14:31:07.349 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:31:06 up  1:32,  0 user,  load average: 0.15, 0.19, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:31:07 compute-0 nova_compute[192903]: 2025-10-06 14:31:07.518 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:31:08 compute-0 nova_compute[192903]: 2025-10-06 14:31:08.026 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:31:08 compute-0 nova_compute[192903]: 2025-10-06 14:31:08.535 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:31:08 compute-0 nova_compute[192903]: 2025-10-06 14:31:08.535 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.235s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:31:08 compute-0 nova_compute[192903]: 2025-10-06 14:31:08.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:08.941 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:0d:89 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ed24bbe2-775f-45e8-8991-c96c17ff11e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed24bbe2-775f-45e8-8991-c96c17ff11e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98c9a01f-eaaf-4720-8685-9dc35b47c38a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cf2b071e-299b-45e4-b609-a6f166c97d91) old=Port_Binding(mac=['fa:16:3e:23:0d:89'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ed24bbe2-775f-45e8-8991-c96c17ff11e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed24bbe2-775f-45e8-8991-c96c17ff11e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:31:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:08.943 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cf2b071e-299b-45e4-b609-a6f166c97d91 in datapath ed24bbe2-775f-45e8-8991-c96c17ff11e3 updated
Oct 06 14:31:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:08.943 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed24bbe2-775f-45e8-8991-c96c17ff11e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:31:08 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:08.944 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8cbd76ed-2e5c-48f1-b013-d17f8dc907d3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:31:09 compute-0 nova_compute[192903]: 2025-10-06 14:31:09.531 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:09 compute-0 nova_compute[192903]: 2025-10-06 14:31:09.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:11.410 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:31:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:11.410 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:31:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:11.411 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:31:12 compute-0 podman[228629]: 2025-10-06 14:31:12.244893611 +0000 UTC m=+0.102937352 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 06 14:31:12 compute-0 nova_compute[192903]: 2025-10-06 14:31:12.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:12 compute-0 nova_compute[192903]: 2025-10-06 14:31:12.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:31:12 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:31:12.672 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:31:13 compute-0 nova_compute[192903]: 2025-10-06 14:31:13.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:14 compute-0 nova_compute[192903]: 2025-10-06 14:31:14.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:14 compute-0 nova_compute[192903]: 2025-10-06 14:31:14.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:15 compute-0 nova_compute[192903]: 2025-10-06 14:31:15.086 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:17 compute-0 podman[228649]: 2025-10-06 14:31:17.197903558 +0000 UTC m=+0.064476681 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 06 14:31:17 compute-0 nova_compute[192903]: 2025-10-06 14:31:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:18 compute-0 nova_compute[192903]: 2025-10-06 14:31:18.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:19 compute-0 ovn_controller[95205]: 2025-10-06T14:31:19Z|00278|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 06 14:31:19 compute-0 nova_compute[192903]: 2025-10-06 14:31:19.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:20 compute-0 nova_compute[192903]: 2025-10-06 14:31:20.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:22 compute-0 nova_compute[192903]: 2025-10-06 14:31:22.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:31:23 compute-0 nova_compute[192903]: 2025-10-06 14:31:23.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:24 compute-0 nova_compute[192903]: 2025-10-06 14:31:24.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:28 compute-0 nova_compute[192903]: 2025-10-06 14:31:28.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:29 compute-0 podman[203308]: time="2025-10-06T14:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:31:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:31:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 06 14:31:29 compute-0 nova_compute[192903]: 2025-10-06 14:31:29.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: ERROR   14:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: ERROR   14:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: ERROR   14:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: ERROR   14:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: ERROR   14:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:31:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:31:33 compute-0 podman[228672]: 2025-10-06 14:31:33.247919601 +0000 UTC m=+0.094463633 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:31:33 compute-0 podman[228676]: 2025-10-06 14:31:33.256702528 +0000 UTC m=+0.084389131 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:31:33 compute-0 podman[228673]: 2025-10-06 14:31:33.260722991 +0000 UTC m=+0.097288883 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 06 14:31:33 compute-0 podman[228671]: 2025-10-06 14:31:33.272604004 +0000 UTC m=+0.124933779 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:31:33 compute-0 nova_compute[192903]: 2025-10-06 14:31:33.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:34 compute-0 nova_compute[192903]: 2025-10-06 14:31:34.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:38 compute-0 nova_compute[192903]: 2025-10-06 14:31:38.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:39 compute-0 nova_compute[192903]: 2025-10-06 14:31:39.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:43 compute-0 podman[228753]: 2025-10-06 14:31:43.223318074 +0000 UTC m=+0.082197679 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:31:43 compute-0 nova_compute[192903]: 2025-10-06 14:31:43.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:44 compute-0 nova_compute[192903]: 2025-10-06 14:31:44.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:48 compute-0 podman[228774]: 2025-10-06 14:31:48.259716585 +0000 UTC m=+0.116271336 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Oct 06 14:31:48 compute-0 nova_compute[192903]: 2025-10-06 14:31:48.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:49 compute-0 nova_compute[192903]: 2025-10-06 14:31:49.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:53 compute-0 nova_compute[192903]: 2025-10-06 14:31:53.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:54 compute-0 nova_compute[192903]: 2025-10-06 14:31:54.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:58 compute-0 nova_compute[192903]: 2025-10-06 14:31:58.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:59 compute-0 podman[203308]: time="2025-10-06T14:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:31:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:31:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 06 14:31:59 compute-0 nova_compute[192903]: 2025-10-06 14:31:59.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:31:59 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: ERROR   14:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: ERROR   14:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: ERROR   14:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: ERROR   14:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: ERROR   14:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:32:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:32:03 compute-0 nova_compute[192903]: 2025-10-06 14:32:03.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:03 compute-0 nova_compute[192903]: 2025-10-06 14:32:03.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:03 compute-0 podman[228800]: 2025-10-06 14:32:03.854893749 +0000 UTC m=+0.080008918 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:32:03 compute-0 podman[228806]: 2025-10-06 14:32:03.855805584 +0000 UTC m=+0.079556885 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:32:03 compute-0 podman[228799]: 2025-10-06 14:32:03.859138528 +0000 UTC m=+0.088265780 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 06 14:32:03 compute-0 podman[228798]: 2025-10-06 14:32:03.893076271 +0000 UTC m=+0.132364718 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 06 14:32:04 compute-0 nova_compute[192903]: 2025-10-06 14:32:04.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:05 compute-0 nova_compute[192903]: 2025-10-06 14:32:05.795 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Creating tmpfile /var/lib/nova/instances/tmpvj49ygrf to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:32:05 compute-0 nova_compute[192903]: 2025-10-06 14:32:05.797 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:05 compute-0 nova_compute[192903]: 2025-10-06 14:32:05.800 2 DEBUG nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvj49ygrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:32:06 compute-0 nova_compute[192903]: 2025-10-06 14:32:06.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.098 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.294 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.296 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.342 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.344 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5819MB free_disk=73.29997634887695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.344 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:07 compute-0 nova_compute[192903]: 2025-10-06 14:32:07.849 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:08 compute-0 nova_compute[192903]: 2025-10-06 14:32:08.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:08 compute-0 nova_compute[192903]: 2025-10-06 14:32:08.893 2 WARNING nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 8e2f40eb-8639-4ced-aea4-ada2fda296e6 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Oct 06 14:32:08 compute-0 nova_compute[192903]: 2025-10-06 14:32:08.894 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:32:08 compute-0 nova_compute[192903]: 2025-10-06 14:32:08.894 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:32:07 up  1:33,  0 user,  load average: 0.05, 0.15, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:32:08 compute-0 nova_compute[192903]: 2025-10-06 14:32:08.973 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:32:09 compute-0 nova_compute[192903]: 2025-10-06 14:32:09.481 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:32:09 compute-0 nova_compute[192903]: 2025-10-06 14:32:09.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:09 compute-0 nova_compute[192903]: 2025-10-06 14:32:09.992 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:32:09 compute-0 nova_compute[192903]: 2025-10-06 14:32:09.993 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.648s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:10 compute-0 nova_compute[192903]: 2025-10-06 14:32:10.989 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:11.412 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:11.412 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:11.413 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:12 compute-0 nova_compute[192903]: 2025-10-06 14:32:12.465 2 DEBUG nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvj49ygrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e2f40eb-8639-4ced-aea4-ada2fda296e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:32:13 compute-0 nova_compute[192903]: 2025-10-06 14:32:13.482 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:32:13 compute-0 nova_compute[192903]: 2025-10-06 14:32:13.482 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:32:13 compute-0 nova_compute[192903]: 2025-10-06 14:32:13.483 2 DEBUG nova.network.neutron [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:32:13 compute-0 nova_compute[192903]: 2025-10-06 14:32:13.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:13 compute-0 nova_compute[192903]: 2025-10-06 14:32:13.990 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:14 compute-0 podman[228888]: 2025-10-06 14:32:14.194973452 +0000 UTC m=+0.058599017 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 06 14:32:14 compute-0 nova_compute[192903]: 2025-10-06 14:32:14.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:14 compute-0 nova_compute[192903]: 2025-10-06 14:32:14.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:32:14 compute-0 nova_compute[192903]: 2025-10-06 14:32:14.697 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:14 compute-0 nova_compute[192903]: 2025-10-06 14:32:14.885 2 DEBUG nova.network.neutron [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Updating instance_info_cache with network_info: [{"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:32:14 compute-0 nova_compute[192903]: 2025-10-06 14:32:14.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.392 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.408 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvj49ygrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e2f40eb-8639-4ced-aea4-ada2fda296e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.409 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Creating instance directory: /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.409 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Creating disk.info with the contents: {'/var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk': 'qcow2', '/var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.410 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.410 2 DEBUG nova.objects.instance [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8e2f40eb-8639-4ced-aea4-ada2fda296e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.917 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.920 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:32:15 compute-0 nova_compute[192903]: 2025-10-06 14:32:15.921 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.011 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.013 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.014 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.015 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.022 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.023 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.099 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.100 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.139 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.140 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.141 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.226 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.227 2 DEBUG nova.virt.disk.api [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.228 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.313 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.314 2 DEBUG nova.virt.disk.api [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.314 2 DEBUG nova.objects.instance [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid 8e2f40eb-8639-4ced-aea4-ada2fda296e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.821 2 DEBUG nova.objects.base [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<8e2f40eb-8639-4ced-aea4-ada2fda296e6> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.822 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.855 2 DEBUG oslo_concurrency.processutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6/disk.config 497664" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.856 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.858 2 DEBUG nova.virt.libvirt.vif [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1721708052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1721708052',id=30,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:31:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0ec553506daf47cdb68df0c86c52faef',ramdisk_id='',reservation_id='r-t0asq9t7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:31:32Z,user_data=None,user_id='4d0489ee5b894f5e87df2cef154bcd29',uuid=8e2f40eb-8639-4ced-aea4-ada2fda296e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.859 2 DEBUG nova.network.os_vif_util [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.860 2 DEBUG nova.network.os_vif_util [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.860 2 DEBUG os_vif [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.862 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.864 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1ede1569-057a-57de-88a7-307deef80958', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.872 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf061a3e9-a2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.873 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf061a3e9-a2, col_values=(('qos', UUID('07a6ed75-8af5-4954-9a40-aad180dcd645')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.874 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf061a3e9-a2, col_values=(('external_ids', {'iface-id': 'f061a3e9-a2f9-4815-92f6-a497f5c756b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:c3:ec', 'vm-uuid': '8e2f40eb-8639-4ced-aea4-ada2fda296e6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 NetworkManager[52035]: <info>  [1759761136.8763] manager: (tapf061a3e9-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.886 2 INFO os_vif [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2')
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.886 2 DEBUG nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.887 2 DEBUG nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvj49ygrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e2f40eb-8639-4ced-aea4-ada2fda296e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.888 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:16 compute-0 nova_compute[192903]: 2025-10-06 14:32:16.999 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:17 compute-0 nova_compute[192903]: 2025-10-06 14:32:17.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:17.413 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:32:17 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:17.414 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:32:17 compute-0 nova_compute[192903]: 2025-10-06 14:32:17.750 2 DEBUG nova.network.neutron [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Port f061a3e9-a2f9-4815-92f6-a497f5c756b8 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:32:17 compute-0 nova_compute[192903]: 2025-10-06 14:32:17.763 2 DEBUG nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvj49ygrf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e2f40eb-8639-4ced-aea4-ada2fda296e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:32:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:18.415 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:19 compute-0 podman[228929]: 2025-10-06 14:32:19.219412964 +0000 UTC m=+0.086640704 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Oct 06 14:32:19 compute-0 nova_compute[192903]: 2025-10-06 14:32:19.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:19 compute-0 nova_compute[192903]: 2025-10-06 14:32:19.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:20 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:32:20 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:32:21 compute-0 kernel: tapf061a3e9-a2: entered promiscuous mode
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.0991] manager: (tapf061a3e9-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_controller[95205]: 2025-10-06T14:32:21Z|00279|binding|INFO|Claiming lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 for this additional chassis.
Oct 06 14:32:21 compute-0 ovn_controller[95205]: 2025-10-06T14:32:21Z|00280|binding|INFO|f061a3e9-a2f9-4815-92f6-a497f5c756b8: Claiming fa:16:3e:d5:c3:ec 10.100.0.13
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.133 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:c3:ec 10.100.0.13'], port_security=['fa:16:3e:d5:c3:ec 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8e2f40eb-8639-4ced-aea4-ada2fda296e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '10', 'neutron:security_group_ids': '992e6d44-ac44-42a2-98c5-27de37a3d90b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab97e14a-5383-4896-8ec6-53d938fe85c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f061a3e9-a2f9-4815-92f6-a497f5c756b8) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.135 104072 INFO neutron.agent.ovn.metadata.agent [-] Port f061a3e9-a2f9-4815-92f6-a497f5c756b8 in datapath e3486723-d121-43c7-9194-63860e513b31 unbound from our chassis
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.136 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:32:21 compute-0 systemd-udevd[228980]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.158 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f615431b-f77c-45cd-b8e2-96c6a569de8e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.159 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3486723-d1 in ovnmeta-e3486723-d121-43c7-9194-63860e513b31 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.160 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3486723-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.161 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c49a39e8-66f1-4296-a0e0-a6f48ff14f17]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.161 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7040e9a5-e49a-4c3c-83d3-115f90e133b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.1665] device (tapf061a3e9-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.1672] device (tapf061a3e9-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.179 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[73fba371-4ea9-4173-a137-ba2702ce50ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 systemd-machined[152985]: New machine qemu-25-instance-0000001e.
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_controller[95205]: 2025-10-06T14:32:21Z|00281|binding|INFO|Setting lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 ovn-installed in OVS
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-0000001e.
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.202 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[01acc0da-3c5f-413e-950e-0e498871375c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.240 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[3594d6f1-7c75-455a-8366-2607d7ca695f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.246 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[27fcb6d7-a7e2-4865-8749-c30b59af5b71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.2473] manager: (tape3486723-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.279 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[7d44f100-3d71-416a-8994-c15908a2f5d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.282 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[1af3d5b1-923c-4307-8444-6aa0942dd871]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.3057] device (tape3486723-d0): carrier: link connected
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.314 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[679b0eb7-0000-4729-93e5-696fb35b07b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.344 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8afae9-f346-47be-83de-90df56e6d347]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3486723-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:6a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560294, 'reachable_time': 31285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229015, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.366 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a93e3014-1301-4c11-98d7-57210a6598da]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:6a05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560294, 'tstamp': 560294}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229016, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.391 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d515bc81-3201-4945-a48d-a989b152e728]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3486723-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:6a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560294, 'reachable_time': 31285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229017, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.435 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[44eee02c-8248-4b4a-bb9c-46fb4b59290a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.529 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7518c7ed-0ae4-43f8-9066-5997f8520238]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.531 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3486723-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.531 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.531 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3486723-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 kernel: tape3486723-d0: entered promiscuous mode
Oct 06 14:32:21 compute-0 NetworkManager[52035]: <info>  [1759761141.5692] manager: (tape3486723-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.572 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3486723-d0, col_values=(('external_ids', {'iface-id': '1ac3ed16-bb39-4937-aa85-df05f50a260e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_controller[95205]: 2025-10-06T14:32:21Z|00282|binding|INFO|Releasing lport 1ac3ed16-bb39-4937-aa85-df05f50a260e from this chassis (sb_readonly=0)
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.576 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fb6911-5b8f-4b45-8c66-0c53caf41980]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.584 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.584 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.585 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for e3486723-d121-43c7-9194-63860e513b31 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.585 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.585 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2390bce3-7389-425e-bdbe-b7daecfa5910]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.586 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.586 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf6555b-d0ae-4b07-a612-4ed54d22a442]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.587 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:32:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:21.589 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'env', 'PROCESS_TAG=haproxy-e3486723-d121-43c7-9194-63860e513b31', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3486723-d121-43c7-9194-63860e513b31.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:32:21 compute-0 nova_compute[192903]: 2025-10-06 14:32:21.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:21 compute-0 podman[229047]: 2025-10-06 14:32:21.954083823 +0000 UTC m=+0.055579352 container create d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:32:22 compute-0 systemd[1]: Started libpod-conmon-d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307.scope.
Oct 06 14:32:22 compute-0 podman[229047]: 2025-10-06 14:32:21.920984713 +0000 UTC m=+0.022480272 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:32:22 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:32:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/130d424781feb938db98da4cc43b71e9e34d4be8c2b272c90b6a3db15ca267b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:32:22 compute-0 podman[229047]: 2025-10-06 14:32:22.05871098 +0000 UTC m=+0.160206569 container init d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 06 14:32:22 compute-0 podman[229047]: 2025-10-06 14:32:22.069440322 +0000 UTC m=+0.170935871 container start d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 06 14:32:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [NOTICE]   (229066) : New worker (229068) forked
Oct 06 14:32:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [NOTICE]   (229066) : Loading success.
Oct 06 14:32:22 compute-0 nova_compute[192903]: 2025-10-06 14:32:22.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:32:24 compute-0 nova_compute[192903]: 2025-10-06 14:32:24.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:25 compute-0 ovn_controller[95205]: 2025-10-06T14:32:25Z|00283|binding|INFO|Claiming lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 for this chassis.
Oct 06 14:32:25 compute-0 ovn_controller[95205]: 2025-10-06T14:32:25Z|00284|binding|INFO|f061a3e9-a2f9-4815-92f6-a497f5c756b8: Claiming fa:16:3e:d5:c3:ec 10.100.0.13
Oct 06 14:32:25 compute-0 ovn_controller[95205]: 2025-10-06T14:32:25Z|00285|binding|INFO|Setting lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 up in Southbound
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.451 2 INFO nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Post operation of migration started
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.452 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.648 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.648 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.723 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.723 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.724 2 DEBUG nova.network.neutron [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:32:26 compute-0 nova_compute[192903]: 2025-10-06 14:32:26.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:27 compute-0 nova_compute[192903]: 2025-10-06 14:32:27.258 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:27 compute-0 nova_compute[192903]: 2025-10-06 14:32:27.634 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:27 compute-0 nova_compute[192903]: 2025-10-06 14:32:27.803 2 DEBUG nova.network.neutron [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Updating instance_info_cache with network_info: [{"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:32:28 compute-0 nova_compute[192903]: 2025-10-06 14:32:28.309 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-8e2f40eb-8639-4ced-aea4-ada2fda296e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:32:28 compute-0 nova_compute[192903]: 2025-10-06 14:32:28.831 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:28 compute-0 nova_compute[192903]: 2025-10-06 14:32:28.831 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:28 compute-0 nova_compute[192903]: 2025-10-06 14:32:28.832 2 DEBUG oslo_concurrency.lockutils [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:28 compute-0 nova_compute[192903]: 2025-10-06 14:32:28.837 2 INFO nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:32:28 compute-0 virtqemud[192802]: Domain id=25 name='instance-0000001e' uuid=8e2f40eb-8639-4ced-aea4-ada2fda296e6 is tainted: custom-monitor
Oct 06 14:32:29 compute-0 podman[203308]: time="2025-10-06T14:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:32:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:32:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Oct 06 14:32:29 compute-0 nova_compute[192903]: 2025-10-06 14:32:29.847 2 INFO nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:32:29 compute-0 nova_compute[192903]: 2025-10-06 14:32:29.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:30 compute-0 nova_compute[192903]: 2025-10-06 14:32:30.854 2 INFO nova.virt.libvirt.driver [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:32:30 compute-0 nova_compute[192903]: 2025-10-06 14:32:30.860 2 DEBUG nova.compute.manager [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:32:31 compute-0 nova_compute[192903]: 2025-10-06 14:32:31.372 2 DEBUG nova.objects.instance [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:32:31 compute-0 openstack_network_exporter[205500]: ERROR   14:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:32:31 compute-0 openstack_network_exporter[205500]: ERROR   14:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:32:31 compute-0 openstack_network_exporter[205500]: ERROR   14:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:32:31 compute-0 openstack_network_exporter[205500]: ERROR   14:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:32:31 compute-0 openstack_network_exporter[205500]: ERROR   14:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:32:31 compute-0 nova_compute[192903]: 2025-10-06 14:32:31.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:32 compute-0 nova_compute[192903]: 2025-10-06 14:32:32.398 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:32 compute-0 nova_compute[192903]: 2025-10-06 14:32:32.646 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:32 compute-0 nova_compute[192903]: 2025-10-06 14:32:32.647 2 WARNING neutronclient.v2_0.client [None req-9d388d8d-13f4-4128-914a-8b2002a280fa f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:34 compute-0 podman[229100]: 2025-10-06 14:32:34.235817674 +0000 UTC m=+0.082818087 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Oct 06 14:32:34 compute-0 podman[229101]: 2025-10-06 14:32:34.242862862 +0000 UTC m=+0.084023851 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 06 14:32:34 compute-0 podman[229107]: 2025-10-06 14:32:34.261004001 +0000 UTC m=+0.097166719 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:32:34 compute-0 podman[229099]: 2025-10-06 14:32:34.261083913 +0000 UTC m=+0.123457477 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Oct 06 14:32:34 compute-0 nova_compute[192903]: 2025-10-06 14:32:34.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:36 compute-0 nova_compute[192903]: 2025-10-06 14:32:36.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:39 compute-0 nova_compute[192903]: 2025-10-06 14:32:39.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:41 compute-0 nova_compute[192903]: 2025-10-06 14:32:41.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.358 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.358 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.358 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.359 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.359 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.369 2 INFO nova.compute.manager [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Terminating instance
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.883 2 DEBUG nova.compute.manager [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:32:42 compute-0 kernel: tapf061a3e9-a2 (unregistering): left promiscuous mode
Oct 06 14:32:42 compute-0 NetworkManager[52035]: <info>  [1759761162.9094] device (tapf061a3e9-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:42 compute-0 ovn_controller[95205]: 2025-10-06T14:32:42Z|00286|binding|INFO|Releasing lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 from this chassis (sb_readonly=0)
Oct 06 14:32:42 compute-0 ovn_controller[95205]: 2025-10-06T14:32:42Z|00287|binding|INFO|Setting lport f061a3e9-a2f9-4815-92f6-a497f5c756b8 down in Southbound
Oct 06 14:32:42 compute-0 ovn_controller[95205]: 2025-10-06T14:32:42Z|00288|binding|INFO|Removing iface tapf061a3e9-a2 ovn-installed in OVS
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:42 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:42.930 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:c3:ec 10.100.0.13'], port_security=['fa:16:3e:d5:c3:ec 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8e2f40eb-8639-4ced-aea4-ada2fda296e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '16', 'neutron:security_group_ids': '992e6d44-ac44-42a2-98c5-27de37a3d90b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab97e14a-5383-4896-8ec6-53d938fe85c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=f061a3e9-a2f9-4815-92f6-a497f5c756b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:32:42 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:42.931 104072 INFO neutron.agent.ovn.metadata.agent [-] Port f061a3e9-a2f9-4815-92f6-a497f5c756b8 in datapath e3486723-d121-43c7-9194-63860e513b31 unbound from our chassis
Oct 06 14:32:42 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:42.932 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3486723-d121-43c7-9194-63860e513b31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:32:42 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:42.933 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b2744ff6-8b2a-4ff2-8b01-44dfbf491ae0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:42 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:42.934 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3486723-d121-43c7-9194-63860e513b31 namespace which is not needed anymore
Oct 06 14:32:42 compute-0 nova_compute[192903]: 2025-10-06 14:32:42.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:42 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct 06 14:32:43 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001e.scope: Consumed 3.715s CPU time.
Oct 06 14:32:43 compute-0 systemd-machined[152985]: Machine qemu-25-instance-0000001e terminated.
Oct 06 14:32:43 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [NOTICE]   (229066) : haproxy version is 3.0.5-8e879a5
Oct 06 14:32:43 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [NOTICE]   (229066) : path to executable is /usr/sbin/haproxy
Oct 06 14:32:43 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [WARNING]  (229066) : Exiting Master process...
Oct 06 14:32:43 compute-0 podman[229210]: 2025-10-06 14:32:43.064320671 +0000 UTC m=+0.032810522 container kill d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.063 2 DEBUG nova.compute.manager [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Received event network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.065 2 DEBUG oslo_concurrency.lockutils [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.065 2 DEBUG oslo_concurrency.lockutils [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:43 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [ALERT]    (229066) : Current worker (229068) exited with code 143 (Terminated)
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.066 2 DEBUG oslo_concurrency.lockutils [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:43 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229062]: [WARNING]  (229066) : All workers exited. Exiting... (0)
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.066 2 DEBUG nova.compute.manager [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] No waiting events found dispatching network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.067 2 DEBUG nova.compute.manager [req-90eb8a10-28d0-4208-9271-8c14c165c8cc req-a9da91ce-28bc-479f-9d96-fe1d2944518b e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Received event network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:32:43 compute-0 systemd[1]: libpod-d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307.scope: Deactivated successfully.
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 podman[229225]: 2025-10-06 14:32:43.115285232 +0000 UTC m=+0.030667462 container died d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307-userdata-shm.mount: Deactivated successfully.
Oct 06 14:32:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-130d424781feb938db98da4cc43b71e9e34d4be8c2b272c90b6a3db15ca267b3-merged.mount: Deactivated successfully.
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.157 2 INFO nova.virt.libvirt.driver [-] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Instance destroyed successfully.
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.157 2 DEBUG nova.objects.instance [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lazy-loading 'resources' on Instance uuid 8e2f40eb-8639-4ced-aea4-ada2fda296e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:32:43 compute-0 podman[229225]: 2025-10-06 14:32:43.179133535 +0000 UTC m=+0.094515715 container cleanup d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:32:43 compute-0 systemd[1]: libpod-conmon-d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307.scope: Deactivated successfully.
Oct 06 14:32:43 compute-0 podman[229227]: 2025-10-06 14:32:43.193978642 +0000 UTC m=+0.098389624 container remove d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.216 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0df2defc-06cb-43d3-b9e3-0bbf562d778c]: (4, ("Mon Oct  6 02:32:43 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31 (d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307)\nd266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307\nMon Oct  6 02:32:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31 (d266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307)\nd266a958edcb4b67615d5d436d8be6376795f28fbba397e5dfa2dfec34045307\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.217 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8259b487-67d8-4560-83da-42535ca78670]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.217 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.218 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[bb31d5d5-d3af-47a8-bafc-6d2994e29e48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.218 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3486723-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 kernel: tape3486723-d0: left promiscuous mode
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.236 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4312b13c-1686-44bb-964e-5a3694e02233]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.261 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e78e221d-c539-499c-9604-f194d6803926]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.262 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b20b75-551f-4fd9-9449-0ec7b3c98564]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.278 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1914bd31-c418-48dc-9f87-4ad6be71ed90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560287, 'reachable_time': 31003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229274, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.279 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3486723-d121-43c7-9194-63860e513b31 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:32:43 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:43.279 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab274a0-8aa7-42cb-9c22-0f95bdb44472]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:32:43 compute-0 systemd[1]: run-netns-ovnmeta\x2de3486723\x2dd121\x2d43c7\x2d9194\x2d63860e513b31.mount: Deactivated successfully.
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.665 2 DEBUG nova.virt.libvirt.vif [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:31:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1721708052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1721708052',id=30,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:31:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0ec553506daf47cdb68df0c86c52faef',ramdisk_id='',reservation_id='r-t0asq9t7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:32:31Z,user_data=None,user_id='4d0489ee5b894f5e87df2cef154bcd29',uuid=8e2f40eb-8639-4ced-aea4-ada2fda296e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.667 2 DEBUG nova.network.os_vif_util [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Converting VIF {"id": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "address": "fa:16:3e:d5:c3:ec", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf061a3e9-a2", "ovs_interfaceid": "f061a3e9-a2f9-4815-92f6-a497f5c756b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.668 2 DEBUG nova.network.os_vif_util [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.669 2 DEBUG os_vif [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.672 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf061a3e9-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.677 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=07a6ed75-8af5-4954-9a40-aad180dcd645) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.684 2 INFO os_vif [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=f061a3e9-a2f9-4815-92f6-a497f5c756b8,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf061a3e9-a2')
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.685 2 INFO nova.virt.libvirt.driver [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Deleting instance files /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6_del
Oct 06 14:32:43 compute-0 nova_compute[192903]: 2025-10-06 14:32:43.686 2 INFO nova.virt.libvirt.driver [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Deletion of /var/lib/nova/instances/8e2f40eb-8639-4ced-aea4-ada2fda296e6_del complete
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.200 2 INFO nova.compute.manager [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.202 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.202 2 DEBUG nova.compute.manager [-] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.203 2 DEBUG nova.network.neutron [-] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.204 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.364 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:32:44 compute-0 nova_compute[192903]: 2025-10-06 14:32:44.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.114 2 DEBUG nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Received event network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.114 2 DEBUG oslo_concurrency.lockutils [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.115 2 DEBUG oslo_concurrency.lockutils [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.116 2 DEBUG oslo_concurrency.lockutils [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.116 2 DEBUG nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] No waiting events found dispatching network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.117 2 DEBUG nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Received event network-vif-unplugged-f061a3e9-a2f9-4815-92f6-a497f5c756b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.117 2 DEBUG nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Received event network-vif-deleted-f061a3e9-a2f9-4815-92f6-a497f5c756b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.118 2 INFO nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Neutron deleted interface f061a3e9-a2f9-4815-92f6-a497f5c756b8; detaching it from the instance and deleting it from the info cache
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.118 2 DEBUG nova.network.neutron [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.129 2 DEBUG nova.network.neutron [-] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:32:45 compute-0 podman[229275]: 2025-10-06 14:32:45.23670534 +0000 UTC m=+0.085928293 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.627 2 DEBUG nova.compute.manager [req-41bf13df-522c-40e7-b316-f1dd18afacd5 req-746315f2-4961-461d-aff4-df95f99dcbaf e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Detach interface failed, port_id=f061a3e9-a2f9-4815-92f6-a497f5c756b8, reason: Instance 8e2f40eb-8639-4ced-aea4-ada2fda296e6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:32:45 compute-0 nova_compute[192903]: 2025-10-06 14:32:45.635 2 INFO nova.compute.manager [-] [instance: 8e2f40eb-8639-4ced-aea4-ada2fda296e6] Took 1.43 seconds to deallocate network for instance.
Oct 06 14:32:46 compute-0 nova_compute[192903]: 2025-10-06 14:32:46.163 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:32:46 compute-0 nova_compute[192903]: 2025-10-06 14:32:46.163 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:32:46 compute-0 nova_compute[192903]: 2025-10-06 14:32:46.168 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:46 compute-0 nova_compute[192903]: 2025-10-06 14:32:46.215 2 INFO nova.scheduler.client.report [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Deleted allocations for instance 8e2f40eb-8639-4ced-aea4-ada2fda296e6
Oct 06 14:32:47 compute-0 nova_compute[192903]: 2025-10-06 14:32:47.243 2 DEBUG oslo_concurrency.lockutils [None req-34d0f7e6-4f16-4a7d-98a5-4369c5168764 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "8e2f40eb-8639-4ced-aea4-ada2fda296e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.885s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:32:48 compute-0 nova_compute[192903]: 2025-10-06 14:32:48.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:49 compute-0 nova_compute[192903]: 2025-10-06 14:32:49.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:50 compute-0 podman[229296]: 2025-10-06 14:32:50.211391377 +0000 UTC m=+0.069448151 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 06 14:32:53 compute-0 nova_compute[192903]: 2025-10-06 14:32:53.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:54 compute-0 nova_compute[192903]: 2025-10-06 14:32:54.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:56.255 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:32:56 compute-0 nova_compute[192903]: 2025-10-06 14:32:56.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:56 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:56.256 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:32:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:32:57.258 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:32:58 compute-0 nova_compute[192903]: 2025-10-06 14:32:58.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:32:59 compute-0 podman[203308]: time="2025-10-06T14:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:32:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:32:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 06 14:32:59 compute-0 nova_compute[192903]: 2025-10-06 14:32:59.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: ERROR   14:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: ERROR   14:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: ERROR   14:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: ERROR   14:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: ERROR   14:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:33:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:33:03 compute-0 nova_compute[192903]: 2025-10-06 14:33:03.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:04 compute-0 nova_compute[192903]: 2025-10-06 14:33:04.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:04 compute-0 nova_compute[192903]: 2025-10-06 14:33:04.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:05 compute-0 podman[229319]: 2025-10-06 14:33:05.243392665 +0000 UTC m=+0.094180545 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4)
Oct 06 14:33:05 compute-0 podman[229325]: 2025-10-06 14:33:05.253139439 +0000 UTC m=+0.093970650 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:33:05 compute-0 podman[229318]: 2025-10-06 14:33:05.255397952 +0000 UTC m=+0.119051173 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 06 14:33:05 compute-0 podman[229320]: 2025-10-06 14:33:05.272577955 +0000 UTC m=+0.122036288 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:33:08 compute-0 nova_compute[192903]: 2025-10-06 14:33:08.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:08 compute-0 nova_compute[192903]: 2025-10-06 14:33:08.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:08 compute-0 nova_compute[192903]: 2025-10-06 14:33:08.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.095 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.301 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.302 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.323 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.324 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5826MB free_disk=73.29996871948242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.324 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.325 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:33:09 compute-0 nova_compute[192903]: 2025-10-06 14:33:09.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.366 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.367 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:33:09 up  1:34,  0 user,  load average: 0.02, 0.12, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.383 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.403 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.404 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.419 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.436 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,CO
MPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.462 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:33:10 compute-0 nova_compute[192903]: 2025-10-06 14:33:10.970 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:33:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:11.414 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:33:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:11.414 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:33:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:11.415 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:33:11 compute-0 nova_compute[192903]: 2025-10-06 14:33:11.479 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:33:11 compute-0 nova_compute[192903]: 2025-10-06 14:33:11.480 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:33:13 compute-0 nova_compute[192903]: 2025-10-06 14:33:13.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:14 compute-0 nova_compute[192903]: 2025-10-06 14:33:14.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:16 compute-0 podman[229408]: 2025-10-06 14:33:16.203819425 +0000 UTC m=+0.063643776 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 06 14:33:18 compute-0 nova_compute[192903]: 2025-10-06 14:33:18.481 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:18 compute-0 nova_compute[192903]: 2025-10-06 14:33:18.481 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:18 compute-0 nova_compute[192903]: 2025-10-06 14:33:18.481 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:33:18 compute-0 nova_compute[192903]: 2025-10-06 14:33:18.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:18 compute-0 nova_compute[192903]: 2025-10-06 14:33:18.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:19 compute-0 nova_compute[192903]: 2025-10-06 14:33:19.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:20 compute-0 nova_compute[192903]: 2025-10-06 14:33:20.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:21 compute-0 podman[229428]: 2025-10-06 14:33:21.228445955 +0000 UTC m=+0.087309402 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, 
version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible)
Oct 06 14:33:21 compute-0 nova_compute[192903]: 2025-10-06 14:33:21.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:23 compute-0 nova_compute[192903]: 2025-10-06 14:33:23.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:24 compute-0 nova_compute[192903]: 2025-10-06 14:33:24.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:33:25 compute-0 nova_compute[192903]: 2025-10-06 14:33:25.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:28 compute-0 nova_compute[192903]: 2025-10-06 14:33:28.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:29 compute-0 podman[203308]: time="2025-10-06T14:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:33:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:33:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 06 14:33:30 compute-0 nova_compute[192903]: 2025-10-06 14:33:30.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: ERROR   14:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: ERROR   14:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: ERROR   14:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: ERROR   14:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: ERROR   14:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:33:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:33:33 compute-0 nova_compute[192903]: 2025-10-06 14:33:33.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:35 compute-0 nova_compute[192903]: 2025-10-06 14:33:35.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:36 compute-0 podman[229451]: 2025-10-06 14:33:36.21545691 +0000 UTC m=+0.067141236 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:33:36 compute-0 podman[229452]: 2025-10-06 14:33:36.239726302 +0000 UTC m=+0.080676747 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:33:36 compute-0 podman[229450]: 2025-10-06 14:33:36.252367037 +0000 UTC m=+0.110087863 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:33:36 compute-0 podman[229453]: 2025-10-06 14:33:36.266833033 +0000 UTC m=+0.105291248 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:33:38 compute-0 nova_compute[192903]: 2025-10-06 14:33:38.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:40 compute-0 nova_compute[192903]: 2025-10-06 14:33:40.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:43 compute-0 nova_compute[192903]: 2025-10-06 14:33:43.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:44 compute-0 nova_compute[192903]: 2025-10-06 14:33:44.323 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Creating tmpfile /var/lib/nova/instances/tmpfi20dqer to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:33:44 compute-0 nova_compute[192903]: 2025-10-06 14:33:44.325 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:44 compute-0 nova_compute[192903]: 2025-10-06 14:33:44.330 2 DEBUG nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfi20dqer',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:33:45 compute-0 nova_compute[192903]: 2025-10-06 14:33:45.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:46 compute-0 nova_compute[192903]: 2025-10-06 14:33:46.375 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:47 compute-0 podman[229535]: 2025-10-06 14:33:47.241173095 +0000 UTC m=+0.095820561 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 06 14:33:48 compute-0 nova_compute[192903]: 2025-10-06 14:33:48.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:50 compute-0 nova_compute[192903]: 2025-10-06 14:33:50.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:50 compute-0 nova_compute[192903]: 2025-10-06 14:33:50.272 2 DEBUG nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfi20dqer',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3525fbd-f412-486a-828c-2e3d202a1848',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:33:51 compute-0 nova_compute[192903]: 2025-10-06 14:33:51.292 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:33:51 compute-0 nova_compute[192903]: 2025-10-06 14:33:51.293 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:33:51 compute-0 nova_compute[192903]: 2025-10-06 14:33:51.293 2 DEBUG nova.network.neutron [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:33:51 compute-0 nova_compute[192903]: 2025-10-06 14:33:51.799 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:52 compute-0 podman[229556]: 2025-10-06 14:33:52.198474422 +0000 UTC m=+0.065913692 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350)
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.231 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.382 2 DEBUG nova.network.neutron [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Updating instance_info_cache with network_info: [{"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.888 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.899 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfi20dqer',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3525fbd-f412-486a-828c-2e3d202a1848',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.900 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Creating instance directory: /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.901 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Creating disk.info with the contents: {'/var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk': 'qcow2', '/var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.901 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:33:52 compute-0 nova_compute[192903]: 2025-10-06 14:33:52.901 2 DEBUG nova.objects.instance [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid a3525fbd-f412-486a-828c-2e3d202a1848 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.407 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.411 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.413 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.496 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.498 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.499 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.500 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.508 2 DEBUG oslo_utils.imageutils.format_inspector [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.509 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.587 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.589 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.624 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.625 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.626 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.679 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.681 2 DEBUG nova.virt.disk.api [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.682 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.771 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.773 2 DEBUG nova.virt.disk.api [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.773 2 DEBUG nova.objects.instance [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid a3525fbd-f412-486a-828c-2e3d202a1848 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:33:53 compute-0 nova_compute[192903]: 2025-10-06 14:33:53.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.298 2 DEBUG nova.objects.base [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<a3525fbd-f412-486a-828c-2e3d202a1848> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.299 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.339 2 DEBUG oslo_concurrency.processutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.341 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.343 2 DEBUG nova.virt.libvirt.vif [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:32:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-546658047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-546658047',id=32,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:33:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0ec553506daf47cdb68df0c86c52faef',ramdisk_id='',reservation_id='r-8p63udk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:33:05Z,user_data=None,user_id='4d0489ee5b894f5e87df2cef154bcd29',uuid=a3525fbd-f412-486a-828c-2e3d202a1848,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.344 2 DEBUG nova.network.os_vif_util [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.346 2 DEBUG nova.network.os_vif_util [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.347 2 DEBUG os_vif [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.349 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.349 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '560cd866-b6a5-50ed-b2bc-80002097130b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.359 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap669f1a21-6e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.360 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap669f1a21-6e, col_values=(('qos', UUID('77b50978-0611-4619-ab51-2dca01c73bc4')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.361 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap669f1a21-6e, col_values=(('external_ids', {'iface-id': '669f1a21-6ecc-480a-a72e-d9e7d011cbac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:97:73', 'vm-uuid': 'a3525fbd-f412-486a-828c-2e3d202a1848'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 NetworkManager[52035]: <info>  [1759761234.3641] manager: (tap669f1a21-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.373 2 INFO os_vif [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e')
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.374 2 DEBUG nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.374 2 DEBUG nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfi20dqer',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3525fbd-f412-486a-828c-2e3d202a1848',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.375 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:54 compute-0 ovn_controller[95205]: 2025-10-06T14:33:54Z|00289|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Oct 06 14:33:54 compute-0 nova_compute[192903]: 2025-10-06 14:33:54.693 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:33:55 compute-0 nova_compute[192903]: 2025-10-06 14:33:55.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:55 compute-0 nova_compute[192903]: 2025-10-06 14:33:55.784 2 DEBUG nova.network.neutron [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Port 669f1a21-6ecc-480a-a72e-d9e7d011cbac updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:33:55 compute-0 nova_compute[192903]: 2025-10-06 14:33:55.800 2 DEBUG nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfi20dqer',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3525fbd-f412-486a-828c-2e3d202a1848',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:33:58 compute-0 kernel: tap669f1a21-6e: entered promiscuous mode
Oct 06 14:33:58 compute-0 NetworkManager[52035]: <info>  [1759761238.7751] manager: (tap669f1a21-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Oct 06 14:33:58 compute-0 ovn_controller[95205]: 2025-10-06T14:33:58Z|00290|binding|INFO|Claiming lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac for this additional chassis.
Oct 06 14:33:58 compute-0 ovn_controller[95205]: 2025-10-06T14:33:58Z|00291|binding|INFO|669f1a21-6ecc-480a-a72e-d9e7d011cbac: Claiming fa:16:3e:ce:97:73 10.100.0.8
Oct 06 14:33:58 compute-0 nova_compute[192903]: 2025-10-06 14:33:58.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.782 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:97:73 10.100.0.8'], port_security=['fa:16:3e:ce:97:73 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a3525fbd-f412-486a-828c-2e3d202a1848', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '10', 'neutron:security_group_ids': '992e6d44-ac44-42a2-98c5-27de37a3d90b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab97e14a-5383-4896-8ec6-53d938fe85c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=669f1a21-6ecc-480a-a72e-d9e7d011cbac) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.784 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 669f1a21-6ecc-480a-a72e-d9e7d011cbac in datapath e3486723-d121-43c7-9194-63860e513b31 unbound from our chassis
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.786 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:33:58 compute-0 nova_compute[192903]: 2025-10-06 14:33:58.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:58 compute-0 ovn_controller[95205]: 2025-10-06T14:33:58Z|00292|binding|INFO|Setting lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac ovn-installed in OVS
Oct 06 14:33:58 compute-0 nova_compute[192903]: 2025-10-06 14:33:58.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:58 compute-0 nova_compute[192903]: 2025-10-06 14:33:58.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:58 compute-0 nova_compute[192903]: 2025-10-06 14:33:58.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.811 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[40d82c67-ef94-45d3-ac44-6a589e368947]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.813 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3486723-d1 in ovnmeta-e3486723-d121-43c7-9194-63860e513b31 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.815 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3486723-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.815 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[344f17f6-1c8d-456e-bb4a-455027d6c56a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.816 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ef946e78-d095-40b7-8fd7-eca3879c5b54]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 systemd-machined[152985]: New machine qemu-26-instance-00000020.
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.834 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[c341014a-f2ae-4311-a8b6-9add4e9bdb33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 systemd-udevd[229614]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.854 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d66e14-2877-4439-afdf-acdcd84be584]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000020.
Oct 06 14:33:58 compute-0 NetworkManager[52035]: <info>  [1759761238.8631] device (tap669f1a21-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:33:58 compute-0 NetworkManager[52035]: <info>  [1759761238.8639] device (tap669f1a21-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.908 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[88754993-f91a-43a4-b047-4f3211a5fa4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 systemd-udevd[229617]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.915 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[47df97ab-954e-47c4-8fac-ad8b5c9f6c22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 NetworkManager[52035]: <info>  [1759761238.9161] manager: (tape3486723-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.969 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[68c3c356-15c2-48b6-9a7b-98e73444ec56]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:58 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:58.973 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[be59b6d7-b724-436d-aa83-66914bfa9154]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 NetworkManager[52035]: <info>  [1759761239.0102] device (tape3486723-d0): carrier: link connected
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.018 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[cba60771-620f-4268-abbd-f7318df67f0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.047 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[076807f2-9c8f-48df-9954-52ad726015fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3486723-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:6a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570065, 'reachable_time': 33159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229645, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.069 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[34a155b6-a7b6-476b-9e7e-e844cba2344e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:6a05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570065, 'tstamp': 570065}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229646, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.089 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d92f38-d25e-4abc-94e5-82eb8a28f9cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3486723-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:6a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570065, 'reachable_time': 33159, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229647, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.133 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd2a574-e1a0-41f9-b0de-c15bb41b59b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.235 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f8457e39-473c-4a4a-aad3-b9013b367ad9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.241 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3486723-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.242 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.242 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3486723-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:59 compute-0 nova_compute[192903]: 2025-10-06 14:33:59.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:59 compute-0 kernel: tape3486723-d0: entered promiscuous mode
Oct 06 14:33:59 compute-0 nova_compute[192903]: 2025-10-06 14:33:59.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:59 compute-0 NetworkManager[52035]: <info>  [1759761239.2503] manager: (tape3486723-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.250 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3486723-d0, col_values=(('external_ids', {'iface-id': '1ac3ed16-bb39-4937-aa85-df05f50a260e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:33:59 compute-0 nova_compute[192903]: 2025-10-06 14:33:59.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:59 compute-0 ovn_controller[95205]: 2025-10-06T14:33:59Z|00293|binding|INFO|Releasing lport 1ac3ed16-bb39-4937-aa85-df05f50a260e from this chassis (sb_readonly=0)
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.254 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[56a15eff-dc4e-4f37-a14c-64b5959d197d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.255 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.256 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.256 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for e3486723-d121-43c7-9194-63860e513b31 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.256 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.257 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2bec31ff-233f-4bbe-995d-10c21fed64b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.258 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.258 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[49b5c6bd-04e6-4016-8cca-93a105db6394]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.259 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID e3486723-d121-43c7-9194-63860e513b31
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:33:59 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:33:59.259 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'env', 'PROCESS_TAG=haproxy-e3486723-d121-43c7-9194-63860e513b31', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3486723-d121-43c7-9194-63860e513b31.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:33:59 compute-0 nova_compute[192903]: 2025-10-06 14:33:59.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:59 compute-0 nova_compute[192903]: 2025-10-06 14:33:59.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:33:59 compute-0 podman[229686]: 2025-10-06 14:33:59.690477022 +0000 UTC m=+0.044848690 container create 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:33:59 compute-0 systemd[1]: Started libpod-conmon-25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0.scope.
Oct 06 14:33:59 compute-0 podman[203308]: time="2025-10-06T14:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:33:59 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:33:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7af9cc5a9e51d21ea24ec71893bef4da7585538dcf825672c239b7c6926c9d8f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:33:59 compute-0 podman[229686]: 2025-10-06 14:33:59.669830962 +0000 UTC m=+0.024202660 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:33:59 compute-0 podman[229686]: 2025-10-06 14:33:59.768841142 +0000 UTC m=+0.123212860 container init 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:33:59 compute-0 podman[229686]: 2025-10-06 14:33:59.774181952 +0000 UTC m=+0.128553620 container start 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true)
Oct 06 14:33:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20760 "" "Go-http-client/1.1"
Oct 06 14:33:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Oct 06 14:33:59 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [NOTICE]   (229707) : New worker (229709) forked
Oct 06 14:33:59 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [NOTICE]   (229707) : Loading success.
Oct 06 14:34:00 compute-0 nova_compute[192903]: 2025-10-06 14:34:00.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: ERROR   14:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: ERROR   14:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: ERROR   14:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: ERROR   14:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: ERROR   14:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:34:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:34:01 compute-0 nova_compute[192903]: 2025-10-06 14:34:01.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:01 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:01.620 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:34:01 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:01.622 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:34:01 compute-0 ovn_controller[95205]: 2025-10-06T14:34:01Z|00294|binding|INFO|Claiming lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac for this chassis.
Oct 06 14:34:01 compute-0 ovn_controller[95205]: 2025-10-06T14:34:01Z|00295|binding|INFO|669f1a21-6ecc-480a-a72e-d9e7d011cbac: Claiming fa:16:3e:ce:97:73 10.100.0.8
Oct 06 14:34:01 compute-0 ovn_controller[95205]: 2025-10-06T14:34:01Z|00296|binding|INFO|Setting lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac up in Southbound
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.772 2 INFO nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Post operation of migration started
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.773 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.863 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.864 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.949 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.949 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:34:02 compute-0 nova_compute[192903]: 2025-10-06 14:34:02.950 2 DEBUG nova.network.neutron [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:34:03 compute-0 nova_compute[192903]: 2025-10-06 14:34:03.457 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:03 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:03.624 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:34:03 compute-0 nova_compute[192903]: 2025-10-06 14:34:03.946 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:04 compute-0 nova_compute[192903]: 2025-10-06 14:34:04.077 2 DEBUG nova.network.neutron [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Updating instance_info_cache with network_info: [{"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:34:04 compute-0 nova_compute[192903]: 2025-10-06 14:34:04.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:04 compute-0 nova_compute[192903]: 2025-10-06 14:34:04.583 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-a3525fbd-f412-486a-828c-2e3d202a1848" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:34:05 compute-0 nova_compute[192903]: 2025-10-06 14:34:05.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:05 compute-0 nova_compute[192903]: 2025-10-06 14:34:05.108 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:05 compute-0 nova_compute[192903]: 2025-10-06 14:34:05.109 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:05 compute-0 nova_compute[192903]: 2025-10-06 14:34:05.109 2 DEBUG oslo_concurrency.lockutils [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:05 compute-0 nova_compute[192903]: 2025-10-06 14:34:05.117 2 INFO nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:34:05 compute-0 virtqemud[192802]: Domain id=26 name='instance-00000020' uuid=a3525fbd-f412-486a-828c-2e3d202a1848 is tainted: custom-monitor
Oct 06 14:34:06 compute-0 nova_compute[192903]: 2025-10-06 14:34:06.125 2 INFO nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:34:06 compute-0 nova_compute[192903]: 2025-10-06 14:34:06.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:07 compute-0 nova_compute[192903]: 2025-10-06 14:34:07.133 2 INFO nova.virt.libvirt.driver [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:34:07 compute-0 nova_compute[192903]: 2025-10-06 14:34:07.138 2 DEBUG nova.compute.manager [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:34:07 compute-0 podman[229735]: 2025-10-06 14:34:07.210310645 +0000 UTC m=+0.063019891 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:34:07 compute-0 podman[229733]: 2025-10-06 14:34:07.22546923 +0000 UTC m=+0.082709743 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:34:07 compute-0 podman[229734]: 2025-10-06 14:34:07.243237209 +0000 UTC m=+0.098610970 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:34:07 compute-0 podman[229732]: 2025-10-06 14:34:07.252495759 +0000 UTC m=+0.110943616 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 06 14:34:07 compute-0 nova_compute[192903]: 2025-10-06 14:34:07.653 2 DEBUG nova.objects.instance [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:34:08 compute-0 nova_compute[192903]: 2025-10-06 14:34:08.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:08 compute-0 nova_compute[192903]: 2025-10-06 14:34:08.673 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:08 compute-0 nova_compute[192903]: 2025-10-06 14:34:08.822 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:08 compute-0 nova_compute[192903]: 2025-10-06 14:34:08.823 2 WARNING neutronclient.v2_0.client [None req-1e198e25-c9ba-4564-aac5-78f86502865b f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:09 compute-0 nova_compute[192903]: 2025-10-06 14:34:09.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:09 compute-0 nova_compute[192903]: 2025-10-06 14:34:09.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:09 compute-0 nova_compute[192903]: 2025-10-06 14:34:09.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:09 compute-0 nova_compute[192903]: 2025-10-06 14:34:09.100 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:34:09 compute-0 nova_compute[192903]: 2025-10-06 14:34:09.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.151 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.244 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.245 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.326 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.517 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.519 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.562 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.563 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5641MB free_disk=73.27109146118164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.563 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:10 compute-0 nova_compute[192903]: 2025-10-06 14:34:10.564 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:11.416 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:11.416 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:11.417 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:11 compute-0 nova_compute[192903]: 2025-10-06 14:34:11.583 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Applying migration context for instance a3525fbd-f412-486a-828c-2e3d202a1848 as it has an incoming, in-progress migration f91c5cb6-33f5-4d72-a1b9-814cacc942eb. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 06 14:34:11 compute-0 nova_compute[192903]: 2025-10-06 14:34:11.584 2 DEBUG nova.objects.instance [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.091 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.128 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance a3525fbd-f412-486a-828c-2e3d202a1848 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1151, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.128 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.129 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:34:10 up  1:35,  0 user,  load average: 0.14, 0.13, 0.22\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_0ec553506daf47cdb68df0c86c52faef': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.170 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:34:12 compute-0 nova_compute[192903]: 2025-10-06 14:34:12.679 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:34:13 compute-0 nova_compute[192903]: 2025-10-06 14:34:13.191 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:34:13 compute-0 nova_compute[192903]: 2025-10-06 14:34:13.191 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.628s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:13 compute-0 nova_compute[192903]: 2025-10-06 14:34:13.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:13 compute-0 nova_compute[192903]: 2025-10-06 14:34:13.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:13 compute-0 nova_compute[192903]: 2025-10-06 14:34:13.583 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:34:14 compute-0 nova_compute[192903]: 2025-10-06 14:34:14.089 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:34:14 compute-0 nova_compute[192903]: 2025-10-06 14:34:14.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:15 compute-0 nova_compute[192903]: 2025-10-06 14:34:15.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:17 compute-0 nova_compute[192903]: 2025-10-06 14:34:17.088 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:17 compute-0 nova_compute[192903]: 2025-10-06 14:34:17.089 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:34:17 compute-0 nova_compute[192903]: 2025-10-06 14:34:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:18 compute-0 podman[229825]: 2025-10-06 14:34:18.228811409 +0000 UTC m=+0.092429687 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:34:19 compute-0 nova_compute[192903]: 2025-10-06 14:34:19.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:20 compute-0 nova_compute[192903]: 2025-10-06 14:34:20.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.683 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "a3525fbd-f412-486a-828c-2e3d202a1848" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.684 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.685 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.685 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.685 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:21 compute-0 nova_compute[192903]: 2025-10-06 14:34:21.705 2 INFO nova.compute.manager [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Terminating instance
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.227 2 DEBUG nova.compute.manager [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:34:22 compute-0 kernel: tap669f1a21-6e (unregistering): left promiscuous mode
Oct 06 14:34:22 compute-0 NetworkManager[52035]: <info>  [1759761262.2528] device (tap669f1a21-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:34:22 compute-0 ovn_controller[95205]: 2025-10-06T14:34:22Z|00297|binding|INFO|Releasing lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac from this chassis (sb_readonly=0)
Oct 06 14:34:22 compute-0 ovn_controller[95205]: 2025-10-06T14:34:22Z|00298|binding|INFO|Setting lport 669f1a21-6ecc-480a-a72e-d9e7d011cbac down in Southbound
Oct 06 14:34:22 compute-0 ovn_controller[95205]: 2025-10-06T14:34:22Z|00299|binding|INFO|Removing iface tap669f1a21-6e ovn-installed in OVS
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.268 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:97:73 10.100.0.8'], port_security=['fa:16:3e:ce:97:73 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a3525fbd-f412-486a-828c-2e3d202a1848', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3486723-d121-43c7-9194-63860e513b31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0ec553506daf47cdb68df0c86c52faef', 'neutron:revision_number': '15', 'neutron:security_group_ids': '992e6d44-ac44-42a2-98c5-27de37a3d90b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab97e14a-5383-4896-8ec6-53d938fe85c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=669f1a21-6ecc-480a-a72e-d9e7d011cbac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.269 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 669f1a21-6ecc-480a-a72e-d9e7d011cbac in datapath e3486723-d121-43c7-9194-63860e513b31 unbound from our chassis
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.269 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3486723-d121-43c7-9194-63860e513b31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.270 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f61556-90b7-43e7-817f-b3185f7e895e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.270 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3486723-d121-43c7-9194-63860e513b31 namespace which is not needed anymore
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:22 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000020.scope: Deactivated successfully.
Oct 06 14:34:22 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000020.scope: Consumed 2.888s CPU time.
Oct 06 14:34:22 compute-0 podman[229846]: 2025-10-06 14:34:22.336418608 +0000 UTC m=+0.055433457 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal 
rhel9, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct 06 14:34:22 compute-0 systemd-machined[152985]: Machine qemu-26-instance-00000020 terminated.
Oct 06 14:34:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [NOTICE]   (229707) : haproxy version is 3.0.5-8e879a5
Oct 06 14:34:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [NOTICE]   (229707) : path to executable is /usr/sbin/haproxy
Oct 06 14:34:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [WARNING]  (229707) : Exiting Master process...
Oct 06 14:34:22 compute-0 podman[229888]: 2025-10-06 14:34:22.392829212 +0000 UTC m=+0.028172202 container kill 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 06 14:34:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [ALERT]    (229707) : Current worker (229709) exited with code 143 (Terminated)
Oct 06 14:34:22 compute-0 neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31[229703]: [WARNING]  (229707) : All workers exited. Exiting... (0)
Oct 06 14:34:22 compute-0 systemd[1]: libpod-25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0.scope: Deactivated successfully.
Oct 06 14:34:22 compute-0 podman[229904]: 2025-10-06 14:34:22.442590589 +0000 UTC m=+0.026944717 container died 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:34:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0-userdata-shm.mount: Deactivated successfully.
Oct 06 14:34:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-7af9cc5a9e51d21ea24ec71893bef4da7585538dcf825672c239b7c6926c9d8f-merged.mount: Deactivated successfully.
Oct 06 14:34:22 compute-0 podman[229904]: 2025-10-06 14:34:22.477949302 +0000 UTC m=+0.062303440 container cleanup 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:34:22 compute-0 systemd[1]: libpod-conmon-25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0.scope: Deactivated successfully.
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.493 2 INFO nova.virt.libvirt.driver [-] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Instance destroyed successfully.
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.494 2 DEBUG nova.objects.instance [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lazy-loading 'resources' on Instance uuid a3525fbd-f412-486a-828c-2e3d202a1848 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:34:22 compute-0 podman[229906]: 2025-10-06 14:34:22.499137047 +0000 UTC m=+0.077272871 container remove 25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.505 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b20415ff-2623-4cbf-9005-3a50f1f21415]: (4, ("Mon Oct  6 02:34:22 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31 (25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0)\n25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0\nMon Oct  6 02:34:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3486723-d121-43c7-9194-63860e513b31 (25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0)\n25a0ebde4fa11685d59a2d1da3c948fa53dd4dcd9cb7a76a2ceac9cadaca43a0\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.506 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[4463e07e-3d99-4961-8f3d-0ec380bab58e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.507 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3486723-d121-43c7-9194-63860e513b31.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.507 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6b323e-20e4-4b7f-b8ed-87daa3a046ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.508 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3486723-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:34:22 compute-0 kernel: tape3486723-d0: left promiscuous mode
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.576 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[9e156345-5d27-44b2-9ecc-9bd440f63432]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.601 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[339c3eff-d7cb-4368-b682-6c4e3ded2c52]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.602 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[c444d47e-d8c4-47e2-96ea-2793dfa8c169]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.618 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[f424a1ca-6ab0-4fcf-b7a2-709b5562a334]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570054, 'reachable_time': 15464, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229955, 'error': None, 'target': 'ovnmeta-e3486723-d121-43c7-9194-63860e513b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.620 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3486723-d121-43c7-9194-63860e513b31 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:34:22 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:22.620 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[608c100b-38d2-4c4b-9075-b261c4956851]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:22 compute-0 systemd[1]: run-netns-ovnmeta\x2de3486723\x2dd121\x2d43c7\x2d9194\x2d63860e513b31.mount: Deactivated successfully.
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.759 2 DEBUG nova.compute.manager [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Received event network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.759 2 DEBUG oslo_concurrency.lockutils [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.759 2 DEBUG oslo_concurrency.lockutils [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.759 2 DEBUG oslo_concurrency.lockutils [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.760 2 DEBUG nova.compute.manager [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] No waiting events found dispatching network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:34:22 compute-0 nova_compute[192903]: 2025-10-06 14:34:22.760 2 DEBUG nova.compute.manager [req-6642d0b0-8583-493b-93fc-47a785169151 req-e0f3b4a5-840e-4f46-961e-2a60b0bba4ee e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Received event network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.001 2 DEBUG nova.virt.libvirt.vif [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:32:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-546658047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-546658047',id=32,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:33:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0ec553506daf47cdb68df0c86c52faef',ramdisk_id='',reservation_id='r-8p63udk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1983018748-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:34:08Z,user_data=None,user_id='4d0489ee5b894f5e87df2cef154bcd29',uuid=a3525fbd-f412-486a-828c-2e3d202a1848,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.001 2 DEBUG nova.network.os_vif_util [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Converting VIF {"id": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "address": "fa:16:3e:ce:97:73", "network": {"id": "e3486723-d121-43c7-9194-63860e513b31", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-291239497-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb0ecd786c974c4e9468e41534d63909", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap669f1a21-6e", "ovs_interfaceid": "669f1a21-6ecc-480a-a72e-d9e7d011cbac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.003 2 DEBUG nova.network.os_vif_util [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.003 2 DEBUG os_vif [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.007 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap669f1a21-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.013 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=77b50978-0611-4619-ab51-2dca01c73bc4) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.019 2 INFO os_vif [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:97:73,bridge_name='br-int',has_traffic_filtering=True,id=669f1a21-6ecc-480a-a72e-d9e7d011cbac,network=Network(e3486723-d121-43c7-9194-63860e513b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap669f1a21-6e')
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.020 2 INFO nova.virt.libvirt.driver [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Deleting instance files /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848_del
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.021 2 INFO nova.virt.libvirt.driver [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Deletion of /var/lib/nova/instances/a3525fbd-f412-486a-828c-2e3d202a1848_del complete
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.535 2 INFO nova.compute.manager [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.535 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.536 2 DEBUG nova.compute.manager [-] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.536 2 DEBUG nova.network.neutron [-] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.536 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:23 compute-0 nova_compute[192903]: 2025-10-06 14:34:23.700 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.416 2 DEBUG nova.network.neutron [-] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.816 2 DEBUG nova.compute.manager [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Received event network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.817 2 DEBUG oslo_concurrency.lockutils [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.817 2 DEBUG oslo_concurrency.lockutils [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.818 2 DEBUG oslo_concurrency.lockutils [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.818 2 DEBUG nova.compute.manager [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] No waiting events found dispatching network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.819 2 DEBUG nova.compute.manager [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Received event network-vif-unplugged-669f1a21-6ecc-480a-a72e-d9e7d011cbac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.819 2 DEBUG nova.compute.manager [req-d38d3362-7b67-4e0f-826b-5583b6a546f6 req-c8055ec5-073e-4e32-93ea-1f0b6e9f32be e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Received event network-vif-deleted-669f1a21-6ecc-480a-a72e-d9e7d011cbac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:34:24 compute-0 nova_compute[192903]: 2025-10-06 14:34:24.922 2 INFO nova.compute.manager [-] [instance: a3525fbd-f412-486a-828c-2e3d202a1848] Took 1.39 seconds to deallocate network for instance.
Oct 06 14:34:25 compute-0 nova_compute[192903]: 2025-10-06 14:34:25.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:25 compute-0 nova_compute[192903]: 2025-10-06 14:34:25.449 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:34:25 compute-0 nova_compute[192903]: 2025-10-06 14:34:25.449 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:34:25 compute-0 nova_compute[192903]: 2025-10-06 14:34:25.505 2 DEBUG nova.compute.provider_tree [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:34:26 compute-0 nova_compute[192903]: 2025-10-06 14:34:26.015 2 DEBUG nova.scheduler.client.report [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:34:26 compute-0 nova_compute[192903]: 2025-10-06 14:34:26.527 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.078s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:26 compute-0 nova_compute[192903]: 2025-10-06 14:34:26.549 2 INFO nova.scheduler.client.report [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Deleted allocations for instance a3525fbd-f412-486a-828c-2e3d202a1848
Oct 06 14:34:27 compute-0 nova_compute[192903]: 2025-10-06 14:34:27.086 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:27 compute-0 nova_compute[192903]: 2025-10-06 14:34:27.580 2 DEBUG oslo_concurrency.lockutils [None req-8219bab6-b0f9-4cea-b090-98711de18b94 4d0489ee5b894f5e87df2cef154bcd29 0ec553506daf47cdb68df0c86c52faef - - default default] Lock "a3525fbd-f412-486a-828c-2e3d202a1848" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.896s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:34:28 compute-0 nova_compute[192903]: 2025-10-06 14:34:28.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:29 compute-0 podman[203308]: time="2025-10-06T14:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:34:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:34:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:34:30 compute-0 nova_compute[192903]: 2025-10-06 14:34:30.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: ERROR   14:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: ERROR   14:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: ERROR   14:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: ERROR   14:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: ERROR   14:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:34:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:34:33 compute-0 nova_compute[192903]: 2025-10-06 14:34:33.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:35 compute-0 nova_compute[192903]: 2025-10-06 14:34:35.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:35 compute-0 nova_compute[192903]: 2025-10-06 14:34:35.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:38 compute-0 nova_compute[192903]: 2025-10-06 14:34:38.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:38 compute-0 podman[229958]: 2025-10-06 14:34:38.243641249 +0000 UTC m=+0.093375603 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:34:38 compute-0 podman[229960]: 2025-10-06 14:34:38.252491967 +0000 UTC m=+0.090590074 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:34:38 compute-0 podman[229959]: 2025-10-06 14:34:38.258090425 +0000 UTC m=+0.108689543 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 06 14:34:38 compute-0 podman[229957]: 2025-10-06 14:34:38.301504144 +0000 UTC m=+0.159426948 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 06 14:34:40 compute-0 nova_compute[192903]: 2025-10-06 14:34:40.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:43 compute-0 nova_compute[192903]: 2025-10-06 14:34:43.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:43 compute-0 nova_compute[192903]: 2025-10-06 14:34:43.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:34:43 compute-0 nova_compute[192903]: 2025-10-06 14:34:43.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:34:45 compute-0 nova_compute[192903]: 2025-10-06 14:34:45.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:46.288 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:a2:2b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-82c89420-2974-43f1-85fe-61a02fb3ca8a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c89420-2974-43f1-85fe-61a02fb3ca8a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09cc89e38601466e984593f236ea0c97', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ae9e21-86e2-4c55-aeba-10a3cbe8ea37, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a478e618-2894-4cee-9ded-21faa87cc035) old=Port_Binding(mac=['fa:16:3e:6a:a2:2b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-82c89420-2974-43f1-85fe-61a02fb3ca8a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82c89420-2974-43f1-85fe-61a02fb3ca8a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09cc89e38601466e984593f236ea0c97', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:34:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:46.289 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a478e618-2894-4cee-9ded-21faa87cc035 in datapath 82c89420-2974-43f1-85fe-61a02fb3ca8a updated
Oct 06 14:34:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:46.289 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82c89420-2974-43f1-85fe-61a02fb3ca8a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:34:46 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:46.290 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[844fb113-266f-4134-9486-27e02ddda300]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:48 compute-0 nova_compute[192903]: 2025-10-06 14:34:48.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:49 compute-0 podman[230042]: 2025-10-06 14:34:49.203378432 +0000 UTC m=+0.069043397 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid)
Oct 06 14:34:50 compute-0 nova_compute[192903]: 2025-10-06 14:34:50.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:53 compute-0 nova_compute[192903]: 2025-10-06 14:34:53.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:53 compute-0 podman[230063]: 2025-10-06 14:34:53.19213216 +0000 UTC m=+0.055700470 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:34:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:54.899 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:7d:a1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-51ae6251-13ae-453b-9ef2-a818a2292d67', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51ae6251-13ae-453b-9ef2-a818a2292d67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '847b461d420642eba1ca9b335248a236', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b83d2ac6-ea2b-4b10-8a14-41377e5d8f45, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=65e1ff80-18fd-4f86-a5e6-a3197a9c7b51) old=Port_Binding(mac=['fa:16:3e:93:7d:a1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-51ae6251-13ae-453b-9ef2-a818a2292d67', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51ae6251-13ae-453b-9ef2-a818a2292d67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '847b461d420642eba1ca9b335248a236', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:34:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:54.900 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 65e1ff80-18fd-4f86-a5e6-a3197a9c7b51 in datapath 51ae6251-13ae-453b-9ef2-a818a2292d67 updated
Oct 06 14:34:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:54.901 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51ae6251-13ae-453b-9ef2-a818a2292d67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:34:54 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:34:54.905 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0f5f9c-085b-4fa7-8e9e-adf3abbf7135]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:34:55 compute-0 nova_compute[192903]: 2025-10-06 14:34:55.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:58 compute-0 nova_compute[192903]: 2025-10-06 14:34:58.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:34:59 compute-0 podman[203308]: time="2025-10-06T14:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:34:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:34:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 06 14:35:00 compute-0 nova_compute[192903]: 2025-10-06 14:35:00.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: ERROR   14:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: ERROR   14:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: ERROR   14:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: ERROR   14:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: ERROR   14:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:35:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:35:03 compute-0 nova_compute[192903]: 2025-10-06 14:35:03.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:05 compute-0 nova_compute[192903]: 2025-10-06 14:35:05.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:08 compute-0 nova_compute[192903]: 2025-10-06 14:35:08.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:08 compute-0 ovn_controller[95205]: 2025-10-06T14:35:08Z|00300|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.090 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.091 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:09 compute-0 podman[230087]: 2025-10-06 14:35:09.231739408 +0000 UTC m=+0.076264420 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:35:09 compute-0 podman[230086]: 2025-10-06 14:35:09.232709874 +0000 UTC m=+0.085571609 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct 06 14:35:09 compute-0 podman[230088]: 2025-10-06 14:35:09.252332849 +0000 UTC m=+0.101948858 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:35:09 compute-0 podman[230085]: 2025-10-06 14:35:09.260743244 +0000 UTC m=+0.120283278 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.606 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.606 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.607 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.607 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:09.728 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:35:09 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:09.730 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.849 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.851 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.883 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.884 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.29993438720703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.885 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:35:09 compute-0 nova_compute[192903]: 2025-10-06 14:35:09.885 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:35:10 compute-0 nova_compute[192903]: 2025-10-06 14:35:10.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:10 compute-0 nova_compute[192903]: 2025-10-06 14:35:10.931 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:35:10 compute-0 nova_compute[192903]: 2025-10-06 14:35:10.931 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:35:09 up  1:36,  0 user,  load average: 0.05, 0.10, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:35:11 compute-0 nova_compute[192903]: 2025-10-06 14:35:11.032 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:35:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:11.418 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:35:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:11.419 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:35:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:11.419 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:35:11 compute-0 nova_compute[192903]: 2025-10-06 14:35:11.539 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:35:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:35:11.731 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:35:12 compute-0 nova_compute[192903]: 2025-10-06 14:35:12.047 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:35:12 compute-0 nova_compute[192903]: 2025-10-06 14:35:12.048 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.163s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:35:13 compute-0 nova_compute[192903]: 2025-10-06 14:35:13.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:13 compute-0 nova_compute[192903]: 2025-10-06 14:35:13.535 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:15 compute-0 nova_compute[192903]: 2025-10-06 14:35:15.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:16 compute-0 nova_compute[192903]: 2025-10-06 14:35:16.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:16 compute-0 nova_compute[192903]: 2025-10-06 14:35:16.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:35:18 compute-0 nova_compute[192903]: 2025-10-06 14:35:18.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:19 compute-0 nova_compute[192903]: 2025-10-06 14:35:19.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:20 compute-0 podman[230173]: 2025-10-06 14:35:20.217905083 +0000 UTC m=+0.082073186 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:35:20 compute-0 nova_compute[192903]: 2025-10-06 14:35:20.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:20 compute-0 nova_compute[192903]: 2025-10-06 14:35:20.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:21 compute-0 nova_compute[192903]: 2025-10-06 14:35:21.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:23 compute-0 nova_compute[192903]: 2025-10-06 14:35:23.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:23 compute-0 nova_compute[192903]: 2025-10-06 14:35:23.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:24 compute-0 podman[230194]: 2025-10-06 14:35:24.246012495 +0000 UTC m=+0.100622573 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, architecture=x86_64, vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Oct 06 14:35:25 compute-0 nova_compute[192903]: 2025-10-06 14:35:25.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:28 compute-0 nova_compute[192903]: 2025-10-06 14:35:28.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:28 compute-0 nova_compute[192903]: 2025-10-06 14:35:28.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:35:29 compute-0 podman[203308]: time="2025-10-06T14:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:35:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:35:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 06 14:35:30 compute-0 nova_compute[192903]: 2025-10-06 14:35:30.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: ERROR   14:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: ERROR   14:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: ERROR   14:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: ERROR   14:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: ERROR   14:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:35:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:35:33 compute-0 nova_compute[192903]: 2025-10-06 14:35:33.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:35 compute-0 nova_compute[192903]: 2025-10-06 14:35:35.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:38 compute-0 nova_compute[192903]: 2025-10-06 14:35:38.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:40 compute-0 podman[230218]: 2025-10-06 14:35:40.220990133 +0000 UTC m=+0.073957689 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 06 14:35:40 compute-0 podman[230219]: 2025-10-06 14:35:40.231973877 +0000 UTC m=+0.073250060 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:35:40 compute-0 podman[230216]: 2025-10-06 14:35:40.246458554 +0000 UTC m=+0.103977102 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:35:40 compute-0 podman[230217]: 2025-10-06 14:35:40.258077245 +0000 UTC m=+0.105796381 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 06 14:35:40 compute-0 nova_compute[192903]: 2025-10-06 14:35:40.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:43 compute-0 nova_compute[192903]: 2025-10-06 14:35:43.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:45 compute-0 nova_compute[192903]: 2025-10-06 14:35:45.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:48 compute-0 nova_compute[192903]: 2025-10-06 14:35:48.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:50 compute-0 nova_compute[192903]: 2025-10-06 14:35:50.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:51 compute-0 podman[230302]: 2025-10-06 14:35:51.223073375 +0000 UTC m=+0.077272898 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:35:53 compute-0 nova_compute[192903]: 2025-10-06 14:35:53.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:55 compute-0 podman[230323]: 2025-10-06 14:35:55.247904449 +0000 UTC m=+0.097047967 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 14:35:55 compute-0 nova_compute[192903]: 2025-10-06 14:35:55.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:58 compute-0 nova_compute[192903]: 2025-10-06 14:35:58.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:35:59 compute-0 podman[203308]: time="2025-10-06T14:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:35:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:35:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 06 14:36:00 compute-0 nova_compute[192903]: 2025-10-06 14:36:00.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:01 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: ERROR   14:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: ERROR   14:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: ERROR   14:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: ERROR   14:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: ERROR   14:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:36:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:36:03 compute-0 nova_compute[192903]: 2025-10-06 14:36:03.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:05 compute-0 nova_compute[192903]: 2025-10-06 14:36:05.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:08 compute-0 nova_compute[192903]: 2025-10-06 14:36:08.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:08 compute-0 nova_compute[192903]: 2025-10-06 14:36:08.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.098 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.271 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.272 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.289 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.290 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5868MB free_disk=73.29997634887695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.290 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:36:09 compute-0 nova_compute[192903]: 2025-10-06 14:36:09.291 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:36:10 compute-0 nova_compute[192903]: 2025-10-06 14:36:10.367 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:36:10 compute-0 nova_compute[192903]: 2025-10-06 14:36:10.368 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:36:09 up  1:37,  0 user,  load average: 0.02, 0.08, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:36:10 compute-0 nova_compute[192903]: 2025-10-06 14:36:10.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:10 compute-0 nova_compute[192903]: 2025-10-06 14:36:10.397 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:36:10 compute-0 nova_compute[192903]: 2025-10-06 14:36:10.906 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:36:11 compute-0 podman[230348]: 2025-10-06 14:36:11.215384068 +0000 UTC m=+0.056870682 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, 
tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:36:11 compute-0 podman[230347]: 2025-10-06 14:36:11.22707489 +0000 UTC m=+0.072875230 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 06 14:36:11 compute-0 podman[230349]: 2025-10-06 14:36:11.237085468 +0000 UTC m=+0.084146551 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:36:11 compute-0 podman[230346]: 2025-10-06 14:36:11.266732371 +0000 UTC m=+0.119755934 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 06 14:36:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:11.420 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:36:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:11.420 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:36:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:11.420 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:36:11 compute-0 nova_compute[192903]: 2025-10-06 14:36:11.428 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:36:11 compute-0 nova_compute[192903]: 2025-10-06 14:36:11.429 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:36:13 compute-0 nova_compute[192903]: 2025-10-06 14:36:13.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:13 compute-0 nova_compute[192903]: 2025-10-06 14:36:13.425 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:13 compute-0 nova_compute[192903]: 2025-10-06 14:36:13.425 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:14.601 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:36:14 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:14.601 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:36:14 compute-0 nova_compute[192903]: 2025-10-06 14:36:14.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:15 compute-0 nova_compute[192903]: 2025-10-06 14:36:15.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:17 compute-0 nova_compute[192903]: 2025-10-06 14:36:17.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:17 compute-0 nova_compute[192903]: 2025-10-06 14:36:17.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:36:18 compute-0 nova_compute[192903]: 2025-10-06 14:36:18.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:18 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:18.603 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:36:20 compute-0 nova_compute[192903]: 2025-10-06 14:36:20.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:20 compute-0 nova_compute[192903]: 2025-10-06 14:36:20.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:22 compute-0 podman[230434]: 2025-10-06 14:36:22.213671346 +0000 UTC m=+0.077967856 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, tcib_build_tag=watcher_latest)
Oct 06 14:36:23 compute-0 nova_compute[192903]: 2025-10-06 14:36:23.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:23 compute-0 nova_compute[192903]: 2025-10-06 14:36:23.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:23 compute-0 nova_compute[192903]: 2025-10-06 14:36:23.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:25 compute-0 nova_compute[192903]: 2025-10-06 14:36:25.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:26 compute-0 podman[230454]: 2025-10-06 14:36:26.240073842 +0000 UTC m=+0.097175020 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Oct 06 14:36:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:28.045 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:a2:ca 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e0cb46ea43d49c481c08810585a4a3f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33ee0085-afaf-49a4-a46e-ec822f622bc0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2f6f5e26-febd-486b-b124-db45f0c7630e) old=Port_Binding(mac=['fa:16:3e:f7:a2:ca'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e0cb46ea43d49c481c08810585a4a3f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:36:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:28.047 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2f6f5e26-febd-486b-b124-db45f0c7630e in datapath 8adc727e-02ae-4b04-987e-6e7497c7d5bb updated
Oct 06 14:36:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:28.047 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8adc727e-02ae-4b04-987e-6e7497c7d5bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:36:28 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:28.049 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc29f7b-f11a-43dc-80a7-3551141ae63c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:36:28 compute-0 nova_compute[192903]: 2025-10-06 14:36:28.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:29 compute-0 nova_compute[192903]: 2025-10-06 14:36:29.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:36:29 compute-0 podman[203308]: time="2025-10-06T14:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:36:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:36:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 06 14:36:30 compute-0 nova_compute[192903]: 2025-10-06 14:36:30.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: ERROR   14:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: ERROR   14:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: ERROR   14:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: ERROR   14:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: ERROR   14:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:36:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:36:33 compute-0 nova_compute[192903]: 2025-10-06 14:36:33.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:34.474 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:a0:65 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7263d81d-2126-4bab-9c3c-130fc209a07c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7263d81d-2126-4bab-9c3c-130fc209a07c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b887a08-5058-43f3-b5a0-f755a94d2252, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=812a18fa-1586-482d-94b5-e41357ab6d72) old=Port_Binding(mac=['fa:16:3e:a8:a0:65'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7263d81d-2126-4bab-9c3c-130fc209a07c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7263d81d-2126-4bab-9c3c-130fc209a07c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:36:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:34.475 104072 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 812a18fa-1586-482d-94b5-e41357ab6d72 in datapath 7263d81d-2126-4bab-9c3c-130fc209a07c updated
Oct 06 14:36:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:34.475 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7263d81d-2126-4bab-9c3c-130fc209a07c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:36:34 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:36:34.476 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[942ed22d-eb4c-41b4-85c0-d7cd9ceb75a8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:36:35 compute-0 nova_compute[192903]: 2025-10-06 14:36:35.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:38 compute-0 nova_compute[192903]: 2025-10-06 14:36:38.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:40 compute-0 nova_compute[192903]: 2025-10-06 14:36:40.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:42 compute-0 podman[230479]: 2025-10-06 14:36:42.223319513 +0000 UTC m=+0.069823199 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 06 14:36:42 compute-0 podman[230480]: 2025-10-06 14:36:42.233658159 +0000 UTC m=+0.081731697 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:36:42 compute-0 podman[230481]: 2025-10-06 14:36:42.234863642 +0000 UTC m=+0.076582460 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:36:42 compute-0 podman[230478]: 2025-10-06 14:36:42.271606224 +0000 UTC m=+0.123438422 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:36:43 compute-0 nova_compute[192903]: 2025-10-06 14:36:43.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:45 compute-0 nova_compute[192903]: 2025-10-06 14:36:45.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:48 compute-0 nova_compute[192903]: 2025-10-06 14:36:48.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:50 compute-0 nova_compute[192903]: 2025-10-06 14:36:50.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:53 compute-0 nova_compute[192903]: 2025-10-06 14:36:53.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:53 compute-0 podman[230560]: 2025-10-06 14:36:53.176630728 +0000 UTC m=+0.048817406 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:36:55 compute-0 nova_compute[192903]: 2025-10-06 14:36:55.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:57 compute-0 podman[230580]: 2025-10-06 14:36:57.203195519 +0000 UTC m=+0.066517900 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc.)
Oct 06 14:36:58 compute-0 nova_compute[192903]: 2025-10-06 14:36:58.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:36:59 compute-0 podman[203308]: time="2025-10-06T14:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:36:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:36:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:37:00 compute-0 nova_compute[192903]: 2025-10-06 14:37:00.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: ERROR   14:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: ERROR   14:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: ERROR   14:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: ERROR   14:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: ERROR   14:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:37:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:37:03 compute-0 nova_compute[192903]: 2025-10-06 14:37:03.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:05 compute-0 nova_compute[192903]: 2025-10-06 14:37:05.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:07 compute-0 nova_compute[192903]: 2025-10-06 14:37:07.334 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:07 compute-0 nova_compute[192903]: 2025-10-06 14:37:07.335 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:07 compute-0 nova_compute[192903]: 2025-10-06 14:37:07.845 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 06 14:37:08 compute-0 nova_compute[192903]: 2025-10-06 14:37:08.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:08 compute-0 nova_compute[192903]: 2025-10-06 14:37:08.427 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:08 compute-0 nova_compute[192903]: 2025-10-06 14:37:08.427 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:08 compute-0 nova_compute[192903]: 2025-10-06 14:37:08.437 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 06 14:37:08 compute-0 nova_compute[192903]: 2025-10-06 14:37:08.438 2 INFO nova.compute.claims [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Claim successful on node compute-0.ctlplane.example.com
Oct 06 14:37:09 compute-0 nova_compute[192903]: 2025-10-06 14:37:09.511 2 DEBUG nova.compute.provider_tree [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.020 2 DEBUG nova.scheduler.client.report [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.534 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.535 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:10 compute-0 nova_compute[192903]: 2025-10-06 14:37:10.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.049 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.050 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.050 2 WARNING neutronclient.v2_0.client [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.051 2 WARNING neutronclient.v2_0.client [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.093 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.094 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.095 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.298 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.299 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.329 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.330 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5874MB free_disk=73.29995727539062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.331 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.331 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:11.421 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:11.421 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:11.422 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:11 compute-0 nova_compute[192903]: 2025-10-06 14:37:11.563 2 INFO nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.074 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.379 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance 05558b3e-f0ce-4e92-a78b-9c680daac7cb actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.380 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.380 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:37:11 up  1:38,  0 user,  load average: 0.15, 0.10, 0.18\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_b34a3d22a4184efeac2c24f99e35e57b': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.424 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:37:12 compute-0 nova_compute[192903]: 2025-10-06 14:37:12.935 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.096 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.097 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.097 2 INFO nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Creating image(s)
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.098 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.098 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.099 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.100 2 DEBUG oslo_utils.imageutils.format_inspector [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.102 2 DEBUG oslo_utils.imageutils.format_inspector [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.105 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.191 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.192 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.192 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.193 2 DEBUG oslo_utils.imageutils.format_inspector [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.195 2 DEBUG oslo_utils.imageutils.format_inspector [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.196 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:13 compute-0 podman[230608]: 2025-10-06 14:37:13.216498424 +0000 UTC m=+0.067432575 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:37:13 compute-0 podman[230606]: 2025-10-06 14:37:13.225075803 +0000 UTC m=+0.078036678 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.236 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Successfully created port: cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 06 14:37:13 compute-0 podman[230605]: 2025-10-06 14:37:13.247461522 +0000 UTC m=+0.108874673 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:37:13 compute-0 podman[230607]: 2025-10-06 14:37:13.252040484 +0000 UTC m=+0.100342444 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.262 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.263 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.294 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.296 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.297 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.351 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.353 2 DEBUG nova.virt.disk.api [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Checking if we can resize image /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.354 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.438 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.440 2 DEBUG nova.virt.disk.api [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Cannot resize image /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.441 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.441 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Ensure instance console log exists: /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.443 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.444 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.445 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.453 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:37:13 compute-0 nova_compute[192903]: 2025-10-06 14:37:13.454 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.454 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.472 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Successfully updated port: cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.574 2 DEBUG nova.compute.manager [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-changed-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.574 2 DEBUG nova.compute.manager [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Refreshing instance network info cache due to event network-changed-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.575 2 DEBUG oslo_concurrency.lockutils [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.575 2 DEBUG oslo_concurrency.lockutils [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.575 2 DEBUG nova.network.neutron [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Refreshing network info cache for port cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 06 14:37:14 compute-0 nova_compute[192903]: 2025-10-06 14:37:14.980 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.081 2 WARNING neutronclient.v2_0.client [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.157 2 DEBUG nova.network.neutron [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.305 2 DEBUG nova.network.neutron [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.813 2 DEBUG oslo_concurrency.lockutils [req-8a645c7c-d80c-41cb-af90-3d8b90f7b167 req-95517819-db59-4304-ab5a-e871364445b0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.814 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquired lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:37:15 compute-0 nova_compute[192903]: 2025-10-06 14:37:15.815 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:37:16 compute-0 nova_compute[192903]: 2025-10-06 14:37:16.915 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.123 2 WARNING neutronclient.v2_0.client [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.321 2 DEBUG nova.network.neutron [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Updating instance_info_cache with network_info: [{"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.831 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Releasing lock "refresh_cache-05558b3e-f0ce-4e92-a78b-9c680daac7cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.832 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance network_info: |[{"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.835 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Start _get_guest_xml network_info=[{"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'size': 0, 'image_id': '22f1b7c7-d15f-4caf-8898-de5e10b0ea89'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.841 2 WARNING nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.843 2 DEBUG nova.virt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1865230288', uuid='05558b3e-f0ce-4e92-a78b-9c680daac7cb'), owner=OwnerMeta(userid='4b10d6387fc3489689b6a36a963ab9f4', username='tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin', projectid='b34a3d22a4184efeac2c24f99e35e57b', projectname='tempest-TestExecuteZoneMigrationStrategy-1465827508'), image=ImageMeta(id='22f1b7c7-d15f-4caf-8898-de5e10b0ea89', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": 
"cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251002161230.cc74260.el10', creation_time=1759761437.8434215) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.849 2 DEBUG nova.virt.libvirt.host [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.850 2 DEBUG nova.virt.libvirt.host [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.852 2 DEBUG nova.virt.libvirt.host [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.853 2 DEBUG nova.virt.libvirt.host [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.854 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.854 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T13:52:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8cb06c85-e9e7-417f-906b-1f7cf29f7de9',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-06T13:52:42Z,direct_url=<?>,disk_format='qcow2',id=22f1b7c7-d15f-4caf-8898-de5e10b0ea89,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fd142f68afa1489aa76784748e93db34',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-06T13:52:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.854 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.855 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.855 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.855 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.855 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.855 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.856 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.856 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.856 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.856 2 DEBUG nova.virt.hardware [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.860 2 DEBUG nova.virt.libvirt.vif [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:37:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1865230288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1865230288',id=37,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b34a3d22a4184efeac2c24f99e35e57b',ramdisk_id='',reservation_id='r-ohdzl21t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1465827508',owner_user_name='tempest-TestExecuteZ
oneMigrationStrategy-1465827508-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:37:12Z,user_data=None,user_id='4b10d6387fc3489689b6a36a963ab9f4',uuid=05558b3e-f0ce-4e92-a78b-9c680daac7cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.860 2 DEBUG nova.network.os_vif_util [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converting VIF {"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.861 2 DEBUG nova.network.os_vif_util [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:37:17 compute-0 nova_compute[192903]: 2025-10-06 14:37:17.862 2 DEBUG nova.objects.instance [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lazy-loading 'pci_devices' on Instance uuid 05558b3e-f0ce-4e92-a78b-9c680daac7cb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.369 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] End _get_guest_xml xml=<domain type="kvm">
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <uuid>05558b3e-f0ce-4e92-a78b-9c680daac7cb</uuid>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <name>instance-00000025</name>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <memory>131072</memory>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <vcpu>1</vcpu>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <metadata>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:package version="32.1.0-0.20251002161230.cc74260.el10"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1865230288</nova:name>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:creationTime>2025-10-06 14:37:17</nova:creationTime>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:flavor name="m1.nano" id="8cb06c85-e9e7-417f-906b-1f7cf29f7de9">
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:memory>128</nova:memory>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:disk>1</nova:disk>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:swap>0</nova:swap>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:ephemeral>0</nova:ephemeral>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:vcpus>1</nova:vcpus>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:extraSpecs>
Oct 06 14:37:18 compute-0 nova_compute[192903]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         </nova:extraSpecs>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       </nova:flavor>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:image uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89">
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:minDisk>1</nova:minDisk>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:minRam>0</nova:minRam>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:properties>
Oct 06 14:37:18 compute-0 nova_compute[192903]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         </nova:properties>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       </nova:image>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:owner>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:user uuid="4b10d6387fc3489689b6a36a963ab9f4">tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin</nova:user>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:project uuid="b34a3d22a4184efeac2c24f99e35e57b">tempest-TestExecuteZoneMigrationStrategy-1465827508</nova:project>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       </nova:owner>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:root type="image" uuid="22f1b7c7-d15f-4caf-8898-de5e10b0ea89"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <nova:ports>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         <nova:port uuid="cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d">
Oct 06 14:37:18 compute-0 nova_compute[192903]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:         </nova:port>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       </nova:ports>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </nova:instance>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </metadata>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <sysinfo type="smbios">
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <system>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="manufacturer">RDO</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="product">OpenStack Compute</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="version">32.1.0-0.20251002161230.cc74260.el10</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="serial">05558b3e-f0ce-4e92-a78b-9c680daac7cb</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="uuid">05558b3e-f0ce-4e92-a78b-9c680daac7cb</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <entry name="family">Virtual Machine</entry>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </system>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </sysinfo>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <os>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <boot dev="hd"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <smbios mode="sysinfo"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </os>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <features>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <acpi/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <apic/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <vmcoreinfo/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </features>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <clock offset="utc">
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <timer name="pit" tickpolicy="delay"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <timer name="hpet" present="no"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </clock>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <cpu mode="host-model" match="exact">
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <topology sockets="1" cores="1" threads="1"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </cpu>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   <devices>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <disk type="file" device="disk">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <target dev="vda" bus="virtio"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <disk type="file" device="cdrom">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <driver name="qemu" type="raw" cache="none"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <source file="/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.config"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <target dev="sda" bus="sata"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </disk>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <interface type="ethernet">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <mac address="fa:16:3e:bd:fd:c8"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <driver name="vhost" rx_queue_size="512"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <mtu size="1442"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <target dev="tapcfed6b36-c3"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </interface>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <serial type="pty">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <log file="/var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/console.log" append="off"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </serial>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <video>
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <model type="virtio"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </video>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <input type="tablet" bus="usb"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <rng model="virtio">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <backend model="random">/dev/urandom</backend>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </rng>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="pci" model="pcie-root-port"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <controller type="usb" index="0"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 06 14:37:18 compute-0 nova_compute[192903]:       <stats period="10"/>
Oct 06 14:37:18 compute-0 nova_compute[192903]:     </memballoon>
Oct 06 14:37:18 compute-0 nova_compute[192903]:   </devices>
Oct 06 14:37:18 compute-0 nova_compute[192903]: </domain>
Oct 06 14:37:18 compute-0 nova_compute[192903]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.371 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Preparing to wait for external event network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.371 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.372 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.372 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.373 2 DEBUG nova.virt.libvirt.vif [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-06T14:37:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1865230288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1865230288',id=37,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b34a3d22a4184efeac2c24f99e35e57b',ramdisk_id='',reservation_id='r-ohdzl21t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1465827508',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:37:12Z,user_data=None,user_id='4b10d6387fc3489689b6a36a963ab9f4',uuid=05558b3e-f0ce-4e92-a78b-9c680daac7cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.373 2 DEBUG nova.network.os_vif_util [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converting VIF {"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.374 2 DEBUG nova.network.os_vif_util [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.374 2 DEBUG os_vif [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.375 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.375 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.376 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6bd989e4-5548-5f7d-98c6-0278acfba12b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfed6b36-c3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcfed6b36-c3, col_values=(('qos', UUID('e5f9c3d4-d42a-4dc9-af5a-aeaadbcf0a9b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcfed6b36-c3, col_values=(('external_ids', {'iface-id': 'cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:fd:c8', 'vm-uuid': '05558b3e-f0ce-4e92-a78b-9c680daac7cb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 NetworkManager[52035]: <info>  [1759761438.3855] manager: (tapcfed6b36-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:18 compute-0 nova_compute[192903]: 2025-10-06 14:37:18.393 2 INFO os_vif [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3')
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.950 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.950 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.951 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] No VIF found with MAC fa:16:3e:bd:fd:c8, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 06 14:37:19 compute-0 nova_compute[192903]: 2025-10-06 14:37:19.952 2 INFO nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Using config drive
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.464 2 WARNING neutronclient.v2_0.client [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.603 2 INFO nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Creating config drive at /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.config
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.612 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpmga_b982 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.740 2 DEBUG oslo_concurrency.processutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251002161230.cc74260.el10 -quiet -J -r -V config-2 /tmp/tmpmga_b982" returned: 0 in 0.128s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:20 compute-0 kernel: tapcfed6b36-c3: entered promiscuous mode
Oct 06 14:37:20 compute-0 NetworkManager[52035]: <info>  [1759761440.8205] manager: (tapcfed6b36-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Oct 06 14:37:20 compute-0 ovn_controller[95205]: 2025-10-06T14:37:20Z|00301|binding|INFO|Claiming lport cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d for this chassis.
Oct 06 14:37:20 compute-0 ovn_controller[95205]: 2025-10-06T14:37:20Z|00302|binding|INFO|cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d: Claiming fa:16:3e:bd:fd:c8 10.100.0.3
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.848 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:fd:c8 10.100.0.3'], port_security=['fa:16:3e:bd:fd:c8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05558b3e-f0ce-4e92-a78b-9c680daac7cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55c8d8e8-c6c8-4e9c-a831-7aa8bca32dc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33ee0085-afaf-49a4-a46e-ec822f622bc0, chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.849 104072 INFO neutron.agent.ovn.metadata.agent [-] Port cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d in datapath 8adc727e-02ae-4b04-987e-6e7497c7d5bb bound to our chassis
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.850 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8adc727e-02ae-4b04-987e-6e7497c7d5bb
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.862 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0674bd69-ee00-42f9-91a7-a811e69628f6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.862 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8adc727e-01 in ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.864 214189 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8adc727e-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.865 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0fe59a-a8e5-4a26-a10c-5c736bb571dd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.866 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[d92e593a-d6bd-4f0b-930b-0d695e9f65af]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 systemd-machined[152985]: New machine qemu-27-instance-00000025.
Oct 06 14:37:20 compute-0 systemd-udevd[230723]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:37:20 compute-0 NetworkManager[52035]: <info>  [1759761440.8814] device (tapcfed6b36-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.879 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[7639f3b2-64c4-4c05-ba9e-04cf9927fb46]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 NetworkManager[52035]: <info>  [1759761440.8825] device (tapcfed6b36-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:37:20 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000025.
Oct 06 14:37:20 compute-0 ovn_controller[95205]: 2025-10-06T14:37:20Z|00303|binding|INFO|Setting lport cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d ovn-installed in OVS
Oct 06 14:37:20 compute-0 ovn_controller[95205]: 2025-10-06T14:37:20Z|00304|binding|INFO|Setting lport cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d up in Southbound
Oct 06 14:37:20 compute-0 nova_compute[192903]: 2025-10-06 14:37:20.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.897 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b56905ac-c5a0-4d3a-866e-1259d4ba289a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.924 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[128e9d6d-47d3-4e20-8c0f-a5caacdf663a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.929 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c32867-8920-4a07-af89-32ca7478c8c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 NetworkManager[52035]: <info>  [1759761440.9302] manager: (tap8adc727e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Oct 06 14:37:20 compute-0 systemd-udevd[230726]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.964 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[10d407dd-1fa6-4c34-84ca-0811933b2d91]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:20.967 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ff9dc9-9ae3-4059-a4af-62aa388db2bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:20 compute-0 NetworkManager[52035]: <info>  [1759761440.9899] device (tap8adc727e-00): carrier: link connected
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.000 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[de19a706-5c26-4a02-8758-2c0d86527ae8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.021 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[292fffbe-a351-443b-acec-1b086158d6a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8adc727e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a2:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590263, 'reachable_time': 39411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230755, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.044 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2ea4c5-a0f7-4a7c-9d69-3c4f6478f18c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:a2ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590263, 'tstamp': 590263}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230756, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.070 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[543b3e81-0486-4f99-bc01-ebe7ef0c50e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8adc727e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a2:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590263, 'reachable_time': 39411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230757, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.114 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebe0d53-06d7-4a50-8f56-5d01a48b7eba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.205 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[ca507cbf-3138-466e-9310-91a4fcb8c414]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.206 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8adc727e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.206 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.207 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8adc727e-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:21 compute-0 nova_compute[192903]: 2025-10-06 14:37:21.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:21 compute-0 NetworkManager[52035]: <info>  [1759761441.2096] manager: (tap8adc727e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Oct 06 14:37:21 compute-0 kernel: tap8adc727e-00: entered promiscuous mode
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.212 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8adc727e-00, col_values=(('external_ids', {'iface-id': '2f6f5e26-febd-486b-b124-db45f0c7630e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:21 compute-0 ovn_controller[95205]: 2025-10-06T14:37:21Z|00305|binding|INFO|Releasing lport 2f6f5e26-febd-486b-b124-db45f0c7630e from this chassis (sb_readonly=0)
Oct 06 14:37:21 compute-0 nova_compute[192903]: 2025-10-06 14:37:21.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:21 compute-0 nova_compute[192903]: 2025-10-06 14:37:21.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.228 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3d5e45-c458-4d41-9e83-3a55c32d7c42]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.229 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.229 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.229 104072 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8adc727e-02ae-4b04-987e-6e7497c7d5bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.230 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.230 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[62c011fe-a8e3-4f4c-beb6-1f272211dde5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.231 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.231 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[151eecbd-4e9f-4928-a981-013024f14c9d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.231 104072 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: global
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     log         /dev/log local0 debug
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     log-tag     haproxy-metadata-proxy-8adc727e-02ae-4b04-987e-6e7497c7d5bb
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     user        root
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     group       root
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     maxconn     1024
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     pidfile     /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     daemon
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: defaults
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     log global
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     mode http
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     option httplog
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     option dontlognull
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     option http-server-close
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     option forwardfor
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     retries                 3
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     timeout http-request    30s
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     timeout connect         30s
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     timeout client          32s
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     timeout server          32s
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     timeout http-keep-alive 30s
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: listen listener
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     bind 169.254.169.254:80
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     server metadata /var/lib/neutron/metadata_proxy
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:     http-request add-header X-OVN-Network-ID 8adc727e-02ae-4b04-987e-6e7497c7d5bb
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 06 14:37:21 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:21.232 104072 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'env', 'PROCESS_TAG=haproxy-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8adc727e-02ae-4b04-987e-6e7497c7d5bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 06 14:37:21 compute-0 podman[230789]: 2025-10-06 14:37:21.673837416 +0000 UTC m=+0.073293391 container create 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 06 14:37:21 compute-0 systemd[1]: Started libpod-conmon-687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30.scope.
Oct 06 14:37:21 compute-0 podman[230789]: 2025-10-06 14:37:21.641535472 +0000 UTC m=+0.040991497 image pull 2aa2ccafff90160e5b202a20e05978c0da57458df68f2a2f36450c3da1cd45e7 38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 06 14:37:21 compute-0 systemd[1]: Started libcrun container.
Oct 06 14:37:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c46aa0621b18a5e65384ddd38e5645b271482372f9beaf098f0d3a84c96011b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 06 14:37:21 compute-0 podman[230789]: 2025-10-06 14:37:21.773619995 +0000 UTC m=+0.173076060 container init 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 06 14:37:21 compute-0 podman[230789]: 2025-10-06 14:37:21.780095278 +0000 UTC m=+0.179551303 container start 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:37:21 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [NOTICE]   (230813) : New worker (230816) forked
Oct 06 14:37:21 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [NOTICE]   (230813) : Loading success.
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.183 2 DEBUG nova.compute.manager [req-16b6a9ce-c1d0-492c-a369-f2004a1aa4c9 req-74086463-d0b7-42f1-929b-786257c56b4f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.183 2 DEBUG oslo_concurrency.lockutils [req-16b6a9ce-c1d0-492c-a369-f2004a1aa4c9 req-74086463-d0b7-42f1-929b-786257c56b4f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.184 2 DEBUG oslo_concurrency.lockutils [req-16b6a9ce-c1d0-492c-a369-f2004a1aa4c9 req-74086463-d0b7-42f1-929b-786257c56b4f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.184 2 DEBUG oslo_concurrency.lockutils [req-16b6a9ce-c1d0-492c-a369-f2004a1aa4c9 req-74086463-d0b7-42f1-929b-786257c56b4f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.184 2 DEBUG nova.compute.manager [req-16b6a9ce-c1d0-492c-a369-f2004a1aa4c9 req-74086463-d0b7-42f1-929b-786257c56b4f e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Processing event network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.239 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.243 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.246 2 INFO nova.virt.libvirt.driver [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance spawned successfully.
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.246 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.759 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.760 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.760 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.760 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.761 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:22 compute-0 nova_compute[192903]: 2025-10-06 14:37:22.761 2 DEBUG nova.virt.libvirt.driver [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 06 14:37:23 compute-0 nova_compute[192903]: 2025-10-06 14:37:23.275 2 INFO nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Took 10.18 seconds to spawn the instance on the hypervisor.
Oct 06 14:37:23 compute-0 nova_compute[192903]: 2025-10-06 14:37:23.275 2 DEBUG nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:37:23 compute-0 nova_compute[192903]: 2025-10-06 14:37:23.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:23 compute-0 nova_compute[192903]: 2025-10-06 14:37:23.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:23 compute-0 nova_compute[192903]: 2025-10-06 14:37:23.811 2 INFO nova.compute.manager [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Took 15.45 seconds to build instance.
Oct 06 14:37:24 compute-0 podman[230826]: 2025-10-06 14:37:24.2415883 +0000 UTC m=+0.091697773 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.253 2 DEBUG nova.compute.manager [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.253 2 DEBUG oslo_concurrency.lockutils [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.254 2 DEBUG oslo_concurrency.lockutils [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.255 2 DEBUG oslo_concurrency.lockutils [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.255 2 DEBUG nova.compute.manager [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] No waiting events found dispatching network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.255 2 WARNING nova.compute.manager [req-351ef075-f30c-4ce2-85ec-83023a42b293 req-f32c093f-f9bd-40bc-acfb-64304cf93ebb e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received unexpected event network-vif-plugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d for instance with vm_state active and task_state None.
Oct 06 14:37:24 compute-0 nova_compute[192903]: 2025-10-06 14:37:24.316 2 DEBUG oslo_concurrency.lockutils [None req-96f0cfab-0e49-4ab2-9a85-652e61c8daed 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.982s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:25 compute-0 nova_compute[192903]: 2025-10-06 14:37:25.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:25 compute-0 nova_compute[192903]: 2025-10-06 14:37:25.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:28 compute-0 podman[230847]: 2025-10-06 14:37:28.251106564 +0000 UTC m=+0.108267006 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64)
Oct 06 14:37:28 compute-0 nova_compute[192903]: 2025-10-06 14:37:28.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:29 compute-0 nova_compute[192903]: 2025-10-06 14:37:29.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:37:29 compute-0 podman[203308]: time="2025-10-06T14:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:37:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:37:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Oct 06 14:37:30 compute-0 nova_compute[192903]: 2025-10-06 14:37:30.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: ERROR   14:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: ERROR   14:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: ERROR   14:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: ERROR   14:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: ERROR   14:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:37:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:37:33 compute-0 nova_compute[192903]: 2025-10-06 14:37:33.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:34 compute-0 ovn_controller[95205]: 2025-10-06T14:37:34Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:fd:c8 10.100.0.3
Oct 06 14:37:34 compute-0 ovn_controller[95205]: 2025-10-06T14:37:34Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:fd:c8 10.100.0.3
Oct 06 14:37:35 compute-0 nova_compute[192903]: 2025-10-06 14:37:35.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:36 compute-0 nova_compute[192903]: 2025-10-06 14:37:36.860 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Creating tmpfile /var/lib/nova/instances/tmphhx0ntf1 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 06 14:37:36 compute-0 nova_compute[192903]: 2025-10-06 14:37:36.861 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:36 compute-0 nova_compute[192903]: 2025-10-06 14:37:36.876 2 DEBUG nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphhx0ntf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 06 14:37:38 compute-0 nova_compute[192903]: 2025-10-06 14:37:38.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:38 compute-0 nova_compute[192903]: 2025-10-06 14:37:38.926 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:40 compute-0 nova_compute[192903]: 2025-10-06 14:37:40.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:42 compute-0 nova_compute[192903]: 2025-10-06 14:37:42.731 2 DEBUG nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphhx0ntf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 06 14:37:43 compute-0 nova_compute[192903]: 2025-10-06 14:37:43.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:43 compute-0 nova_compute[192903]: 2025-10-06 14:37:43.751 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:37:43 compute-0 nova_compute[192903]: 2025-10-06 14:37:43.752 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:37:43 compute-0 nova_compute[192903]: 2025-10-06 14:37:43.752 2 DEBUG nova.network.neutron [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:37:44 compute-0 podman[230882]: 2025-10-06 14:37:44.201781937 +0000 UTC m=+0.061257259 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:37:44 compute-0 podman[230881]: 2025-10-06 14:37:44.223671642 +0000 UTC m=+0.088312082 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 06 14:37:44 compute-0 podman[230894]: 2025-10-06 14:37:44.240889233 +0000 UTC m=+0.072469289 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:37:44 compute-0 podman[230888]: 2025-10-06 14:37:44.241012846 +0000 UTC m=+0.080227776 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 06 14:37:44 compute-0 nova_compute[192903]: 2025-10-06 14:37:44.260 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.093 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.235 2 DEBUG nova.network.neutron [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Updating instance_info_cache with network_info: [{"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.743 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.760 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphhx0ntf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.761 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Creating instance directory: /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.762 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Creating disk.info with the contents: {'/var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk': 'qcow2', '/var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.763 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 06 14:37:45 compute-0 nova_compute[192903]: 2025-10-06 14:37:45.763 2 DEBUG nova.objects.instance [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'trusted_certs' on Instance uuid ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.272 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.275 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.276 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.347 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.348 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.348 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.349 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.352 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.352 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.409 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.411 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.462 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3,backing_fmt=raw /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.464 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "16d9b63b0a1c17047664d4c69e07c724f73dffd3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.465 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.527 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/16d9b63b0a1c17047664d4c69e07c724f73dffd3 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.529 2 DEBUG nova.virt.disk.api [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Checking if we can resize image /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.530 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.586 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.587 2 DEBUG nova.virt.disk.api [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Cannot resize image /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 06 14:37:46 compute-0 nova_compute[192903]: 2025-10-06 14:37:46.588 2 DEBUG nova.objects.instance [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lazy-loading 'migration_context' on Instance uuid ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.098 2 DEBUG nova.objects.base [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Object Instance<ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.099 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.140 2 DEBUG oslo_concurrency.processutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk.config 497664" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.141 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.143 2 DEBUG nova.virt.libvirt.vif [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-06T14:36:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1568413757',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1568413757',id=36,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:36:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b34a3d22a4184efeac2c24f99e35e57b',ramdisk_id='',reservation_id='r-4v1b19jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1465827508',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T14:37:00Z,user_data=None,user_id='4b10d6387fc3489689b6a36a963ab9f4',uuid=ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.144 2 DEBUG nova.network.os_vif_util [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converting VIF {"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.145 2 DEBUG nova.network.os_vif_util [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.146 2 DEBUG os_vif [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.148 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.149 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.150 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '847c9c24-1b32-5c5a-b016-769114e3d91d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f914008-9f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3f914008-9f, col_values=(('qos', UUID('365b8723-fe55-408d-816a-4fa15438bfdb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3f914008-9f, col_values=(('external_ids', {'iface-id': '3f914008-9fbc-428e-8aa8-f80f42de5f4f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:f1:a6', 'vm-uuid': 'ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 NetworkManager[52035]: <info>  [1759761467.1626] manager: (tap3f914008-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.172 2 INFO os_vif [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f')
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.173 2 DEBUG nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.174 2 DEBUG nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphhx0ntf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.175 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:47 compute-0 nova_compute[192903]: 2025-10-06 14:37:47.263 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:48 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:48.011 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:37:48 compute-0 nova_compute[192903]: 2025-10-06 14:37:48.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:48 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:48.013 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:37:48 compute-0 nova_compute[192903]: 2025-10-06 14:37:48.811 2 DEBUG nova.network.neutron [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Port 3f914008-9fbc-428e-8aa8-f80f42de5f4f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 06 14:37:48 compute-0 nova_compute[192903]: 2025-10-06 14:37:48.827 2 DEBUG nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphhx0ntf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 06 14:37:50 compute-0 nova_compute[192903]: 2025-10-06 14:37:50.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:51 compute-0 ovn_controller[95205]: 2025-10-06T14:37:51Z|00306|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:37:51 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 06 14:37:51 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 06 14:37:51 compute-0 NetworkManager[52035]: <info>  [1759761471.9421] manager: (tap3f914008-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Oct 06 14:37:51 compute-0 kernel: tap3f914008-9f: entered promiscuous mode
Oct 06 14:37:51 compute-0 ovn_controller[95205]: 2025-10-06T14:37:51Z|00307|binding|INFO|Claiming lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f for this additional chassis.
Oct 06 14:37:51 compute-0 ovn_controller[95205]: 2025-10-06T14:37:51Z|00308|binding|INFO|3f914008-9fbc-428e-8aa8-f80f42de5f4f: Claiming fa:16:3e:db:f1:a6 10.100.0.4
Oct 06 14:37:51 compute-0 nova_compute[192903]: 2025-10-06 14:37:51.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:51.951 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:f1:a6 10.100.0.4'], port_security=['fa:16:3e:db:f1:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '55c8d8e8-c6c8-4e9c-a831-7aa8bca32dc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33ee0085-afaf-49a4-a46e-ec822f622bc0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3f914008-9fbc-428e-8aa8-f80f42de5f4f) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:37:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:51.952 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 3f914008-9fbc-428e-8aa8-f80f42de5f4f in datapath 8adc727e-02ae-4b04-987e-6e7497c7d5bb unbound from our chassis
Oct 06 14:37:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:51.954 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8adc727e-02ae-4b04-987e-6e7497c7d5bb
Oct 06 14:37:51 compute-0 ovn_controller[95205]: 2025-10-06T14:37:51Z|00309|binding|INFO|Setting lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f ovn-installed in OVS
Oct 06 14:37:51 compute-0 nova_compute[192903]: 2025-10-06 14:37:51.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:51 compute-0 nova_compute[192903]: 2025-10-06 14:37:51.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:51 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:51.985 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[10d3c87d-0145-4167-8034-080164b02f32]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:51 compute-0 systemd-udevd[231019]: Network interface NamePolicy= disabled on kernel command line.
Oct 06 14:37:52 compute-0 NetworkManager[52035]: <info>  [1759761472.0104] device (tap3f914008-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 06 14:37:52 compute-0 NetworkManager[52035]: <info>  [1759761472.0113] device (tap3f914008-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 06 14:37:52 compute-0 systemd-machined[152985]: New machine qemu-28-instance-00000024.
Oct 06 14:37:52 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000024.
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.029 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[dad90d65-26c9-4b04-b662-fd2680f0cb34]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.036 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[79077eaf-81a0-4ce6-9c8c-8db74314677b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.070 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[60d148a2-3086-4bea-b92a-99c1c79b6915]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.094 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[dd46ac4c-90b4-4545-a0b4-93aaa9aa1134]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8adc727e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a2:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590263, 'reachable_time': 39411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231034, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.115 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[65af539c-8e7d-4db9-856d-4f63356f4b92]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8adc727e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590279, 'tstamp': 590279}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231036, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8adc727e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590283, 'tstamp': 590283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231036, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.117 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8adc727e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:52 compute-0 nova_compute[192903]: 2025-10-06 14:37:52.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.120 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8adc727e-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.120 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.120 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8adc727e-00, col_values=(('external_ids', {'iface-id': '2f6f5e26-febd-486b-b124-db45f0c7630e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:52 compute-0 nova_compute[192903]: 2025-10-06 14:37:52.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.121 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:37:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:52.122 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[362e84d7-bdfd-4ba9-b83b-c1dd33e8a1d9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8adc727e-02ae-4b04-987e-6e7497c7d5bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8adc727e-02ae-4b04-987e-6e7497c7d5bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:37:52 compute-0 nova_compute[192903]: 2025-10-06 14:37:52.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:55 compute-0 podman[231057]: 2025-10-06 14:37:55.244386412 +0000 UTC m=+0.091194721 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 06 14:37:55 compute-0 nova_compute[192903]: 2025-10-06 14:37:55.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:56 compute-0 ovn_controller[95205]: 2025-10-06T14:37:56Z|00310|binding|INFO|Claiming lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f for this chassis.
Oct 06 14:37:56 compute-0 ovn_controller[95205]: 2025-10-06T14:37:56Z|00311|binding|INFO|3f914008-9fbc-428e-8aa8-f80f42de5f4f: Claiming fa:16:3e:db:f1:a6 10.100.0.4
Oct 06 14:37:56 compute-0 ovn_controller[95205]: 2025-10-06T14:37:56Z|00312|binding|INFO|Setting lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f up in Southbound
Oct 06 14:37:57 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:37:57.014 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.359 2 INFO nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Post operation of migration started
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.360 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.839 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.840 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.914 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.914 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquired lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 06 14:37:57 compute-0 nova_compute[192903]: 2025-10-06 14:37:57.915 2 DEBUG nova.network.neutron [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 06 14:37:58 compute-0 nova_compute[192903]: 2025-10-06 14:37:58.421 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:59 compute-0 nova_compute[192903]: 2025-10-06 14:37:59.085 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:37:59 compute-0 podman[231079]: 2025-10-06 14:37:59.236925361 +0000 UTC m=+0.087792299 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 06 14:37:59 compute-0 nova_compute[192903]: 2025-10-06 14:37:59.262 2 DEBUG nova.network.neutron [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Updating instance_info_cache with network_info: [{"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:37:59 compute-0 podman[203308]: time="2025-10-06T14:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:37:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20764 "" "Go-http-client/1.1"
Oct 06 14:37:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Oct 06 14:37:59 compute-0 nova_compute[192903]: 2025-10-06 14:37:59.768 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Releasing lock "refresh_cache-ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 06 14:38:00 compute-0 nova_compute[192903]: 2025-10-06 14:38:00.284 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:00 compute-0 nova_compute[192903]: 2025-10-06 14:38:00.285 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:00 compute-0 nova_compute[192903]: 2025-10-06 14:38:00.286 2 DEBUG oslo_concurrency.lockutils [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:00 compute-0 nova_compute[192903]: 2025-10-06 14:38:00.292 2 INFO nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 06 14:38:00 compute-0 virtqemud[192802]: Domain id=28 name='instance-00000024' uuid=ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c is tainted: custom-monitor
Oct 06 14:38:00 compute-0 nova_compute[192903]: 2025-10-06 14:38:00.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:01 compute-0 nova_compute[192903]: 2025-10-06 14:38:01.303 2 INFO nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: ERROR   14:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: ERROR   14:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: ERROR   14:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: ERROR   14:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: ERROR   14:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:38:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:38:02 compute-0 nova_compute[192903]: 2025-10-06 14:38:02.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:02 compute-0 nova_compute[192903]: 2025-10-06 14:38:02.310 2 INFO nova.virt.libvirt.driver [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 06 14:38:02 compute-0 nova_compute[192903]: 2025-10-06 14:38:02.317 2 DEBUG nova.compute.manager [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 06 14:38:02 compute-0 nova_compute[192903]: 2025-10-06 14:38:02.828 2 DEBUG nova.objects.instance [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 06 14:38:03 compute-0 nova_compute[192903]: 2025-10-06 14:38:03.848 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:04 compute-0 nova_compute[192903]: 2025-10-06 14:38:04.353 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:04 compute-0 nova_compute[192903]: 2025-10-06 14:38:04.354 2 WARNING neutronclient.v2_0.client [None req-6412aa4f-c246-4033-9a49-c7042e3c4976 f9cb50fc4a1347d8b4d2c8c244d73b61 69cf06613b5f44abb7aa9d7e505adede - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:05 compute-0 nova_compute[192903]: 2025-10-06 14:38:05.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:06 compute-0 nova_compute[192903]: 2025-10-06 14:38:06.997 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:06 compute-0 nova_compute[192903]: 2025-10-06 14:38:06.998 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:06 compute-0 nova_compute[192903]: 2025-10-06 14:38:06.998 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:06 compute-0 nova_compute[192903]: 2025-10-06 14:38:06.998 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:06 compute-0 nova_compute[192903]: 2025-10-06 14:38:06.998 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.013 2 INFO nova.compute.manager [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Terminating instance
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.536 2 DEBUG nova.compute.manager [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:38:07 compute-0 kernel: tapcfed6b36-c3 (unregistering): left promiscuous mode
Oct 06 14:38:07 compute-0 NetworkManager[52035]: <info>  [1759761487.5742] device (tapcfed6b36-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 ovn_controller[95205]: 2025-10-06T14:38:07Z|00313|binding|INFO|Releasing lport cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d from this chassis (sb_readonly=0)
Oct 06 14:38:07 compute-0 ovn_controller[95205]: 2025-10-06T14:38:07Z|00314|binding|INFO|Setting lport cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d down in Southbound
Oct 06 14:38:07 compute-0 ovn_controller[95205]: 2025-10-06T14:38:07Z|00315|binding|INFO|Removing iface tapcfed6b36-c3 ovn-installed in OVS
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.614 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:fd:c8 10.100.0.3'], port_security=['fa:16:3e:bd:fd:c8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '05558b3e-f0ce-4e92-a78b-9c680daac7cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '55c8d8e8-c6c8-4e9c-a831-7aa8bca32dc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33ee0085-afaf-49a4-a46e-ec822f622bc0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.615 104072 INFO neutron.agent.ovn.metadata.agent [-] Port cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d in datapath 8adc727e-02ae-4b04-987e-6e7497c7d5bb unbound from our chassis
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.619 104072 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8adc727e-02ae-4b04-987e-6e7497c7d5bb
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.645 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[940932f7-63ec-442a-846e-781af7744077]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct 06 14:38:07 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000025.scope: Consumed 14.467s CPU time.
Oct 06 14:38:07 compute-0 systemd-machined[152985]: Machine qemu-27-instance-00000025 terminated.
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.691 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[82d1de19-f3d8-45da-b5f6-943605dc2ade]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.694 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[a7724011-1915-44d0-9f11-034e0ca68908]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.732 216656 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1af19a-1878-4b7c-b647-6edf3ae5e8ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.732 2 DEBUG nova.compute.manager [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.733 2 DEBUG oslo_concurrency.lockutils [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.733 2 DEBUG oslo_concurrency.lockutils [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.734 2 DEBUG oslo_concurrency.lockutils [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.734 2 DEBUG nova.compute.manager [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] No waiting events found dispatching network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.735 2 DEBUG nova.compute.manager [req-77212911-cfa4-446b-b187-bfe7a106d2e0 req-cc61880e-97db-4507-855e-2ba2cca671a1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.756 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb79969-73d5-4ad0-af0a-0db5b63fdd2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8adc727e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a2:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590263, 'reachable_time': 39411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231112, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.778 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a585bd6b-99c5-40cd-a7e5-45d6aecb7780]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8adc727e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590279, 'tstamp': 590279}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231117, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8adc727e-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590283, 'tstamp': 590283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231117, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.780 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8adc727e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.789 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8adc727e-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.789 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.790 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8adc727e-00, col_values=(('external_ids', {'iface-id': '2f6f5e26-febd-486b-b124-db45f0c7630e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.790 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 06 14:38:07 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:07.792 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[0ee80a1d-ab35-41ab-9bf3-0a8382f47aa9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8adc727e-02ae-4b04-987e-6e7497c7d5bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8adc727e-02ae-4b04-987e-6e7497c7d5bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.822 2 INFO nova.virt.libvirt.driver [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Instance destroyed successfully.
Oct 06 14:38:07 compute-0 nova_compute[192903]: 2025-10-06 14:38:07.822 2 DEBUG nova.objects.instance [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lazy-loading 'resources' on Instance uuid 05558b3e-f0ce-4e92-a78b-9c680daac7cb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.329 2 DEBUG nova.virt.libvirt.vif [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-06T14:37:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1865230288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1865230288',id=37,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:37:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b34a3d22a4184efeac2c24f99e35e57b',ramdisk_id='',reservation_id='r-ohdzl21t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1465827508',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:37:23Z,user_data=None,user_id='4b10d6387fc3489689b6a36a963ab9f4',uuid=05558b3e-f0ce-4e92-a78b-9c680daac7cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.330 2 DEBUG nova.network.os_vif_util [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converting VIF {"id": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "address": "fa:16:3e:bd:fd:c8", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfed6b36-c3", "ovs_interfaceid": "cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.330 2 DEBUG nova.network.os_vif_util [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.330 2 DEBUG os_vif [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfed6b36-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.335 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e5f9c3d4-d42a-4dc9-af5a-aeaadbcf0a9b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.338 2 INFO os_vif [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:fd:c8,bridge_name='br-int',has_traffic_filtering=True,id=cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfed6b36-c3')
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.339 2 INFO nova.virt.libvirt.driver [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Deleting instance files /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb_del
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.339 2 INFO nova.virt.libvirt.driver [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Deletion of /var/lib/nova/instances/05558b3e-f0ce-4e92-a78b-9c680daac7cb_del complete
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.853 2 INFO nova.compute.manager [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.853 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.854 2 DEBUG nova.compute.manager [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.854 2 DEBUG nova.network.neutron [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:38:08 compute-0 nova_compute[192903]: 2025-10-06 14:38:08.854 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.796 2 DEBUG nova.compute.manager [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.796 2 DEBUG oslo_concurrency.lockutils [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.797 2 DEBUG oslo_concurrency.lockutils [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.797 2 DEBUG oslo_concurrency.lockutils [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.797 2 DEBUG nova.compute.manager [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] No waiting events found dispatching network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.798 2 DEBUG nova.compute.manager [req-16c13481-374f-456c-959b-291b7fca7903 req-e0ac9f12-2fc4-4026-b233-c6a1c2705be1 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-unplugged-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:38:09 compute-0 nova_compute[192903]: 2025-10-06 14:38:09.801 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:10 compute-0 nova_compute[192903]: 2025-10-06 14:38:10.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:11.423 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:11.423 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:11.424 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:11 compute-0 nova_compute[192903]: 2025-10-06 14:38:11.556 2 DEBUG nova.network.neutron [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:38:11 compute-0 nova_compute[192903]: 2025-10-06 14:38:11.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:11 compute-0 nova_compute[192903]: 2025-10-06 14:38:11.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:11 compute-0 nova_compute[192903]: 2025-10-06 14:38:11.880 2 DEBUG nova.compute.manager [req-872a03ec-8951-47f2-8a9a-58c7bf000888 req-a222cc50-e8a4-4684-99f3-b234dff743a3 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Received event network-vif-deleted-cfed6b36-c3f2-4b5d-9aef-2a970f8dcd6d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.066 2 INFO nova.compute.manager [-] [instance: 05558b3e-f0ce-4e92-a78b-9c680daac7cb] Took 3.21 seconds to deallocate network for instance.
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.096 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.097 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.097 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.584 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.585 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.614 2 DEBUG nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.651 2 DEBUG nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.651 2 DEBUG nova.compute.provider_tree [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.664 2 DEBUG nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.689 2 DEBUG nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:38:12 compute-0 nova_compute[192903]: 2025-10-06 14:38:12.740 2 DEBUG nova.compute.provider_tree [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.145 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.224 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.225 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.253 2 DEBUG nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.292 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.438 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.439 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.468 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.469 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.27108764648438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.469 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.770 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.185s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.774 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.306s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:13 compute-0 nova_compute[192903]: 2025-10-06 14:38:13.801 2 INFO nova.scheduler.client.report [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Deleted allocations for instance 05558b3e-f0ce-4e92-a78b-9c680daac7cb
Oct 06 14:38:14 compute-0 nova_compute[192903]: 2025-10-06 14:38:14.845 2 DEBUG oslo_concurrency.lockutils [None req-8f9ed995-83ab-4196-b8d7-5a5b71aef4ca 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "05558b3e-f0ce-4e92-a78b-9c680daac7cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.847s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:15 compute-0 podman[231143]: 2025-10-06 14:38:15.202953723 +0000 UTC m=+0.064278690 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 06 14:38:15 compute-0 podman[231144]: 2025-10-06 14:38:15.203050136 +0000 UTC m=+0.058541827 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:38:15 compute-0 podman[231142]: 2025-10-06 14:38:15.220845862 +0000 UTC m=+0.077135644 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 06 14:38:15 compute-0 podman[231141]: 2025-10-06 14:38:15.221359466 +0000 UTC m=+0.085846917 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.360 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Instance ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.361 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.361 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:38:13 up  1:39,  0 user,  load average: 0.22, 0.13, 0.19\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_b34a3d22a4184efeac2c24f99e35e57b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.399 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.666 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.667 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.667 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.667 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.668 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.680 2 INFO nova.compute.manager [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Terminating instance
Oct 06 14:38:15 compute-0 nova_compute[192903]: 2025-10-06 14:38:15.911 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.199 2 DEBUG nova.compute.manager [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 06 14:38:16 compute-0 kernel: tap3f914008-9f (unregistering): left promiscuous mode
Oct 06 14:38:16 compute-0 NetworkManager[52035]: <info>  [1759761496.2272] device (tap3f914008-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 06 14:38:16 compute-0 ovn_controller[95205]: 2025-10-06T14:38:16Z|00316|binding|INFO|Releasing lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f from this chassis (sb_readonly=0)
Oct 06 14:38:16 compute-0 ovn_controller[95205]: 2025-10-06T14:38:16Z|00317|binding|INFO|Setting lport 3f914008-9fbc-428e-8aa8-f80f42de5f4f down in Southbound
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 ovn_controller[95205]: 2025-10-06T14:38:16Z|00318|binding|INFO|Removing iface tap3f914008-9f ovn-installed in OVS
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.246 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:f1:a6 10.100.0.4'], port_security=['fa:16:3e:db:f1:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b34a3d22a4184efeac2c24f99e35e57b', 'neutron:revision_number': '16', 'neutron:security_group_ids': '55c8d8e8-c6c8-4e9c-a831-7aa8bca32dc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33ee0085-afaf-49a4-a46e-ec822f622bc0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>], logical_port=3f914008-9fbc-428e-8aa8-f80f42de5f4f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7ff1029de540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.247 104072 INFO neutron.agent.ovn.metadata.agent [-] Port 3f914008-9fbc-428e-8aa8-f80f42de5f4f in datapath 8adc727e-02ae-4b04-987e-6e7497c7d5bb unbound from our chassis
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.249 104072 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8adc727e-02ae-4b04-987e-6e7497c7d5bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.249 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[a14a6412-8606-47f7-a53a-19f7493c822b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.250 104072 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb namespace which is not needed anymore
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000024.scope: Deactivated successfully.
Oct 06 14:38:16 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000024.scope: Consumed 3.428s CPU time.
Oct 06 14:38:16 compute-0 systemd-machined[152985]: Machine qemu-28-instance-00000024 terminated.
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.376 2 DEBUG nova.compute.manager [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Received event network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.377 2 DEBUG oslo_concurrency.lockutils [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.377 2 DEBUG oslo_concurrency.lockutils [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.378 2 DEBUG oslo_concurrency.lockutils [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.378 2 DEBUG nova.compute.manager [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] No waiting events found dispatching network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.379 2 DEBUG nova.compute.manager [req-f4340582-85da-488e-91fb-2cf216aaa51a req-964e04c6-d295-445f-b1c4-d63456ab3eb0 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Received event network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:38:16 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [NOTICE]   (230813) : haproxy version is 3.0.5-8e879a5
Oct 06 14:38:16 compute-0 podman[231248]: 2025-10-06 14:38:16.394503452 +0000 UTC m=+0.035592413 container kill 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 06 14:38:16 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [NOTICE]   (230813) : path to executable is /usr/sbin/haproxy
Oct 06 14:38:16 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [WARNING]  (230813) : Exiting Master process...
Oct 06 14:38:16 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [ALERT]    (230813) : Current worker (230816) exited with code 143 (Terminated)
Oct 06 14:38:16 compute-0 neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb[230803]: [WARNING]  (230813) : All workers exited. Exiting... (0)
Oct 06 14:38:16 compute-0 systemd[1]: libpod-687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30.scope: Deactivated successfully.
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.419 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.420 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.645s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:16 compute-0 podman[231265]: 2025-10-06 14:38:16.441191081 +0000 UTC m=+0.025490993 container died 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.460 2 INFO nova.virt.libvirt.driver [-] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Instance destroyed successfully.
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.461 2 DEBUG nova.objects.instance [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lazy-loading 'resources' on Instance uuid ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 06 14:38:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30-userdata-shm.mount: Deactivated successfully.
Oct 06 14:38:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-c46aa0621b18a5e65384ddd38e5645b271482372f9beaf098f0d3a84c96011b3-merged.mount: Deactivated successfully.
Oct 06 14:38:16 compute-0 podman[231265]: 2025-10-06 14:38:16.480006089 +0000 UTC m=+0.064306001 container cleanup 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:38:16 compute-0 systemd[1]: libpod-conmon-687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30.scope: Deactivated successfully.
Oct 06 14:38:16 compute-0 podman[231266]: 2025-10-06 14:38:16.497077326 +0000 UTC m=+0.075363507 container remove 687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.502 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[493de270-ea8f-4fdc-94e5-95c89f377403]: (4, ("Mon Oct  6 02:38:16 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb (687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30)\n687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30\nMon Oct  6 02:38:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb (687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30)\n687f614c4e0f3ad74e0688cf44f4c6fc9d9c5e51cbbc0c23f705820c8ac2ea30\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.504 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[6005b046-010f-4c28-b1f4-6f6aec03d0a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.505 104072 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8adc727e-02ae-4b04-987e-6e7497c7d5bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.505 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8d544fcb-2126-4a04-98b5-32a0a5b40e6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.506 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8adc727e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 kernel: tap8adc727e-00: left promiscuous mode
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.525 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[06bf5707-7293-467c-bffe-52abc2a3c7fa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.553 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[8f278a6e-52b6-4f0f-a8bf-621885294408]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.554 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[39a95906-85b3-48e7-99ad-542933c00478]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.570 214189 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c1f622-017c-409b-af88-c6b05be29d3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590256, 'reachable_time': 22749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231315, 'error': None, 'target': 'ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d8adc727e\x2d02ae\x2d4b04\x2d987e\x2d6e7497c7d5bb.mount: Deactivated successfully.
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.574 104207 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8adc727e-02ae-4b04-987e-6e7497c7d5bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 06 14:38:16 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:16.574 104207 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2b6149-e0f9-49e9-bbb7-a70831b0baca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.968 2 DEBUG nova.virt.libvirt.vif [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-06T14:36:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1568413757',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1568413757',id=36,image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-06T14:36:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b34a3d22a4184efeac2c24f99e35e57b',ramdisk_id='',reservation_id='r-4v1b19jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='22f1b7c7-d15f-4caf-8898-de5e10b0ea89',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1465827508',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1465827508-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-06T14:38:03Z,user_data=None,user_id='4b10d6387fc3489689b6a36a963ab9f4',uuid=ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.969 2 DEBUG nova.network.os_vif_util [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converting VIF {"id": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "address": "fa:16:3e:db:f1:a6", "network": {"id": "8adc727e-02ae-4b04-987e-6e7497c7d5bb", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-441679032-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e0cb46ea43d49c481c08810585a4a3f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f914008-9f", "ovs_interfaceid": "3f914008-9fbc-428e-8aa8-f80f42de5f4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.970 2 DEBUG nova.network.os_vif_util [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.971 2 DEBUG os_vif [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.974 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f914008-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=365b8723-fe55-408d-816a-4fa15438bfdb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.986 2 INFO os_vif [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:a6,bridge_name='br-int',has_traffic_filtering=True,id=3f914008-9fbc-428e-8aa8-f80f42de5f4f,network=Network(8adc727e-02ae-4b04-987e-6e7497c7d5bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f914008-9f')
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.987 2 INFO nova.virt.libvirt.driver [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Deleting instance files /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c_del
Oct 06 14:38:16 compute-0 nova_compute[192903]: 2025-10-06 14:38:16.988 2 INFO nova.virt.libvirt.driver [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Deletion of /var/lib/nova/instances/ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c_del complete
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.415 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.503 2 INFO nova.compute.manager [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Took 1.30 seconds to destroy the instance on the hypervisor.
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.503 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.504 2 DEBUG nova.compute.manager [-] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.504 2 DEBUG nova.network.neutron [-] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.504 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:17 compute-0 nova_compute[192903]: 2025-10-06 14:38:17.829 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 06 14:38:18 compute-0 nova_compute[192903]: 2025-10-06 14:38:18.614 2 DEBUG nova.network.neutron [-] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.380 2 DEBUG nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Received event network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.381 2 DEBUG oslo_concurrency.lockutils [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Acquiring lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.381 2 DEBUG oslo_concurrency.lockutils [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.382 2 DEBUG oslo_concurrency.lockutils [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.382 2 DEBUG nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] No waiting events found dispatching network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.383 2 DEBUG nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Received event network-vif-unplugged-3f914008-9fbc-428e-8aa8-f80f42de5f4f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.383 2 DEBUG nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Received event network-vif-deleted-3f914008-9fbc-428e-8aa8-f80f42de5f4f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.383 2 INFO nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Neutron deleted interface 3f914008-9fbc-428e-8aa8-f80f42de5f4f; detaching it from the instance and deleting it from the info cache
Oct 06 14:38:19 compute-0 nova_compute[192903]: 2025-10-06 14:38:19.384 2 DEBUG nova.network.neutron [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.031 2 INFO nova.compute.manager [-] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Took 2.53 seconds to deallocate network for instance.
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.040 2 DEBUG nova.compute.manager [req-585e64d8-ef17-4f5e-906d-3e7f747c2b21 req-de6473b5-60f4-4aa6-a740-46d3d1697b78 e8f67e9c58134994beb18b3f63b5cce6 69cf06613b5f44abb7aa9d7e505adede - - default default] [instance: ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c] Detach interface failed, port_id=3f914008-9fbc-428e-8aa8-f80f42de5f4f, reason: Instance ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.562 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.563 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:38:20 compute-0 nova_compute[192903]: 2025-10-06 14:38:20.615 2 DEBUG nova.compute.provider_tree [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.123 2 DEBUG nova.scheduler.client.report [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.635 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.662 2 INFO nova.scheduler.client.report [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Deleted allocations for instance ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c
Oct 06 14:38:21 compute-0 nova_compute[192903]: 2025-10-06 14:38:21.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:22 compute-0 nova_compute[192903]: 2025-10-06 14:38:22.688 2 DEBUG oslo_concurrency.lockutils [None req-da2fe1db-5699-41c9-868e-449c83b3434a 4b10d6387fc3489689b6a36a963ab9f4 b34a3d22a4184efeac2c24f99e35e57b - - default default] Lock "ffcedb11-18b4-4b8a-83d9-81ea4b5bd03c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.022s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:38:23 compute-0 nova_compute[192903]: 2025-10-06 14:38:23.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:24 compute-0 nova_compute[192903]: 2025-10-06 14:38:24.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:25 compute-0 nova_compute[192903]: 2025-10-06 14:38:25.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:25 compute-0 nova_compute[192903]: 2025-10-06 14:38:25.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:26 compute-0 podman[231316]: 2025-10-06 14:38:26.209895203 +0000 UTC m=+0.066496779 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Oct 06 14:38:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:26.837 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:38:26 compute-0 nova_compute[192903]: 2025-10-06 14:38:26.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:26 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:26.839 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:38:26 compute-0 nova_compute[192903]: 2025-10-06 14:38:26.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:27 compute-0 nova_compute[192903]: 2025-10-06 14:38:27.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:29 compute-0 podman[203308]: time="2025-10-06T14:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:38:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:38:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:38:30 compute-0 podman[231338]: 2025-10-06 14:38:30.229515787 +0000 UTC m=+0.083876744 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct 06 14:38:30 compute-0 nova_compute[192903]: 2025-10-06 14:38:30.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:30 compute-0 nova_compute[192903]: 2025-10-06 14:38:30.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: ERROR   14:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: ERROR   14:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: ERROR   14:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: ERROR   14:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: ERROR   14:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:38:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:38:31 compute-0 nova_compute[192903]: 2025-10-06 14:38:31.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:33 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:38:33.840 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:38:35 compute-0 nova_compute[192903]: 2025-10-06 14:38:35.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:36 compute-0 nova_compute[192903]: 2025-10-06 14:38:36.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:40 compute-0 nova_compute[192903]: 2025-10-06 14:38:40.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:41 compute-0 nova_compute[192903]: 2025-10-06 14:38:41.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:45 compute-0 nova_compute[192903]: 2025-10-06 14:38:45.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:46 compute-0 podman[231363]: 2025-10-06 14:38:46.216056043 +0000 UTC m=+0.075233854 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 06 14:38:46 compute-0 podman[231364]: 2025-10-06 14:38:46.230452898 +0000 UTC m=+0.076972090 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 06 14:38:46 compute-0 podman[231371]: 2025-10-06 14:38:46.239179381 +0000 UTC m=+0.079357884 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:38:46 compute-0 podman[231362]: 2025-10-06 14:38:46.251027698 +0000 UTC m=+0.116671742 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:38:46 compute-0 nova_compute[192903]: 2025-10-06 14:38:46.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:50 compute-0 nova_compute[192903]: 2025-10-06 14:38:50.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:51 compute-0 nova_compute[192903]: 2025-10-06 14:38:51.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:55 compute-0 nova_compute[192903]: 2025-10-06 14:38:55.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:56 compute-0 nova_compute[192903]: 2025-10-06 14:38:56.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:38:57 compute-0 podman[231449]: 2025-10-06 14:38:57.214458379 +0000 UTC m=+0.078305693 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:38:59 compute-0 podman[203308]: time="2025-10-06T14:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:38:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:38:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 06 14:39:00 compute-0 nova_compute[192903]: 2025-10-06 14:39:00.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:01 compute-0 podman[231469]: 2025-10-06 14:39:01.185768413 +0000 UTC m=+0.054604870 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: ERROR   14:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: ERROR   14:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: ERROR   14:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: ERROR   14:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: ERROR   14:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:39:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:39:01 compute-0 nova_compute[192903]: 2025-10-06 14:39:01.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:03 compute-0 ovn_controller[95205]: 2025-10-06T14:39:03Z|00319|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 06 14:39:05 compute-0 nova_compute[192903]: 2025-10-06 14:39:05.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:07 compute-0 nova_compute[192903]: 2025-10-06 14:39:07.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:10 compute-0 nova_compute[192903]: 2025-10-06 14:39:10.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:39:11.425 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:39:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:39:11.426 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:39:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:39:11.426 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:39:11 compute-0 nova_compute[192903]: 2025-10-06 14:39:11.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:11 compute-0 nova_compute[192903]: 2025-10-06 14:39:11.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.100 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.323 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.325 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.367 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.368 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.2921257019043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.368 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:39:12 compute-0 nova_compute[192903]: 2025-10-06 14:39:12.368 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:39:13 compute-0 nova_compute[192903]: 2025-10-06 14:39:13.425 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:39:13 compute-0 nova_compute[192903]: 2025-10-06 14:39:13.425 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:39:12 up  1:40,  0 user,  load average: 0.08, 0.11, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:39:13 compute-0 nova_compute[192903]: 2025-10-06 14:39:13.449 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:39:13 compute-0 nova_compute[192903]: 2025-10-06 14:39:13.955 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:39:14 compute-0 nova_compute[192903]: 2025-10-06 14:39:14.483 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:39:14 compute-0 nova_compute[192903]: 2025-10-06 14:39:14.484 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:39:14 compute-0 nova_compute[192903]: 2025-10-06 14:39:14.484 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:14 compute-0 nova_compute[192903]: 2025-10-06 14:39:14.485 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:39:14 compute-0 nova_compute[192903]: 2025-10-06 14:39:14.994 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:39:15 compute-0 nova_compute[192903]: 2025-10-06 14:39:15.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:17 compute-0 nova_compute[192903]: 2025-10-06 14:39:17.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:17 compute-0 podman[231495]: 2025-10-06 14:39:17.231460198 +0000 UTC m=+0.083153973 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Oct 06 14:39:17 compute-0 podman[231494]: 2025-10-06 14:39:17.239989206 +0000 UTC m=+0.092774320 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:39:17 compute-0 podman[231496]: 2025-10-06 14:39:17.246624113 +0000 UTC m=+0.087220622 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:39:17 compute-0 podman[231493]: 2025-10-06 14:39:17.26821963 +0000 UTC m=+0.130155579 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:39:17 compute-0 nova_compute[192903]: 2025-10-06 14:39:17.990 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:20 compute-0 nova_compute[192903]: 2025-10-06 14:39:20.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:21 compute-0 nova_compute[192903]: 2025-10-06 14:39:21.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:21 compute-0 nova_compute[192903]: 2025-10-06 14:39:21.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:39:22 compute-0 nova_compute[192903]: 2025-10-06 14:39:22.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:23 compute-0 sshd-session[231576]: Accepted publickey for zuul from 192.168.122.10 port 37946 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 14:39:23 compute-0 systemd-logind[789]: New session 35 of user zuul.
Oct 06 14:39:23 compute-0 systemd[1]: Started Session 35 of User zuul.
Oct 06 14:39:23 compute-0 sshd-session[231576]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 14:39:23 compute-0 sudo[231580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 06 14:39:23 compute-0 sudo[231580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 14:39:23 compute-0 nova_compute[192903]: 2025-10-06 14:39:23.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:24 compute-0 nova_compute[192903]: 2025-10-06 14:39:24.576 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:25 compute-0 nova_compute[192903]: 2025-10-06 14:39:25.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:26 compute-0 nova_compute[192903]: 2025-10-06 14:39:26.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:27 compute-0 nova_compute[192903]: 2025-10-06 14:39:27.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:27 compute-0 nova_compute[192903]: 2025-10-06 14:39:27.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:27 compute-0 ovs-vsctl[231748]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 06 14:39:28 compute-0 podman[231784]: 2025-10-06 14:39:28.20898449 +0000 UTC m=+0.070279149 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4)
Oct 06 14:39:28 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 231604 (sos)
Oct 06 14:39:28 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 06 14:39:28 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 06 14:39:28 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 06 14:39:28 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 06 14:39:28 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 06 14:39:29 compute-0 kernel: block vda: the capability attribute has been deprecated.
Oct 06 14:39:29 compute-0 podman[203308]: time="2025-10-06T14:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:39:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:39:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 06 14:39:29 compute-0 crontab[232197]: (root) LIST (root)
Oct 06 14:39:30 compute-0 nova_compute[192903]: 2025-10-06 14:39:30.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: ERROR   14:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: ERROR   14:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: ERROR   14:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: ERROR   14:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: ERROR   14:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:39:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:39:31 compute-0 nova_compute[192903]: 2025-10-06 14:39:31.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:32 compute-0 nova_compute[192903]: 2025-10-06 14:39:32.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:32 compute-0 podman[232291]: 2025-10-06 14:39:32.212152615 +0000 UTC m=+0.067921646 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 14:39:32 compute-0 systemd[1]: Starting Hostname Service...
Oct 06 14:39:32 compute-0 systemd[1]: Started Hostname Service.
Oct 06 14:39:35 compute-0 nova_compute[192903]: 2025-10-06 14:39:35.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:37 compute-0 nova_compute[192903]: 2025-10-06 14:39:37.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:37 compute-0 nova_compute[192903]: 2025-10-06 14:39:37.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:38 compute-0 ovs-appctl[233328]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 06 14:39:38 compute-0 ovs-appctl[233339]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 06 14:39:38 compute-0 ovs-appctl[233348]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 06 14:39:40 compute-0 nova_compute[192903]: 2025-10-06 14:39:40.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:42 compute-0 nova_compute[192903]: 2025-10-06 14:39:42.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:45 compute-0 nova_compute[192903]: 2025-10-06 14:39:45.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:45 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 06 14:39:47 compute-0 nova_compute[192903]: 2025-10-06 14:39:47.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:47 compute-0 podman[234732]: 2025-10-06 14:39:47.345379578 +0000 UTC m=+0.066266612 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 06 14:39:47 compute-0 podman[234729]: 2025-10-06 14:39:47.368401453 +0000 UTC m=+0.092580685 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 06 14:39:47 compute-0 podman[234733]: 2025-10-06 14:39:47.384306338 +0000 UTC m=+0.097216878 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:39:47 compute-0 podman[234746]: 2025-10-06 14:39:47.426823534 +0000 UTC m=+0.112321102 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 06 14:39:48 compute-0 systemd[1]: Starting Time & Date Service...
Oct 06 14:39:48 compute-0 systemd[1]: Started Time & Date Service.
Oct 06 14:39:50 compute-0 nova_compute[192903]: 2025-10-06 14:39:50.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:51 compute-0 nova_compute[192903]: 2025-10-06 14:39:51.161 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:52 compute-0 nova_compute[192903]: 2025-10-06 14:39:52.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 06 14:39:55 compute-0 nova_compute[192903]: 2025-10-06 14:39:55.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:39:55 compute-0 nova_compute[192903]: 2025-10-06 14:39:55.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:39:55 compute-0 nova_compute[192903]: 2025-10-06 14:39:55.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:57 compute-0 nova_compute[192903]: 2025-10-06 14:39:57.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:39:58 compute-0 podman[234845]: 2025-10-06 14:39:58.736738268 +0000 UTC m=+0.119994688 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct 06 14:39:59 compute-0 podman[203308]: time="2025-10-06T14:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:39:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:39:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 06 14:40:00 compute-0 nova_compute[192903]: 2025-10-06 14:40:00.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: ERROR   14:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: ERROR   14:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: ERROR   14:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: ERROR   14:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: ERROR   14:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:40:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:40:02 compute-0 nova_compute[192903]: 2025-10-06 14:40:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:02 compute-0 podman[234867]: 2025-10-06 14:40:02.897753591 +0000 UTC m=+0.076926466 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 06 14:40:05 compute-0 nova_compute[192903]: 2025-10-06 14:40:05.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:07 compute-0 nova_compute[192903]: 2025-10-06 14:40:07.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:07 compute-0 sudo[231580]: pam_unix(sudo:session): session closed for user root
Oct 06 14:40:07 compute-0 sshd-session[231579]: Received disconnect from 192.168.122.10 port 37946:11: disconnected by user
Oct 06 14:40:07 compute-0 sshd-session[231579]: Disconnected from user zuul 192.168.122.10 port 37946
Oct 06 14:40:07 compute-0 sshd-session[231576]: pam_unix(sshd:session): session closed for user zuul
Oct 06 14:40:07 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Oct 06 14:40:07 compute-0 systemd[1]: session-35.scope: Consumed 1min 14.015s CPU time, 507.7M memory peak, read 107.0M from disk, written 18.3M to disk.
Oct 06 14:40:07 compute-0 systemd-logind[789]: Session 35 logged out. Waiting for processes to exit.
Oct 06 14:40:07 compute-0 systemd-logind[789]: Removed session 35.
Oct 06 14:40:08 compute-0 sshd-session[234886]: Accepted publickey for zuul from 192.168.122.10 port 54648 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 14:40:08 compute-0 systemd-logind[789]: New session 36 of user zuul.
Oct 06 14:40:08 compute-0 systemd[1]: Started Session 36 of User zuul.
Oct 06 14:40:08 compute-0 sshd-session[234886]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 14:40:08 compute-0 sudo[234890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-10-06-umhmrjy.tar.xz
Oct 06 14:40:08 compute-0 sudo[234890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 14:40:08 compute-0 sudo[234890]: pam_unix(sudo:session): session closed for user root
Oct 06 14:40:08 compute-0 sshd-session[234889]: Received disconnect from 192.168.122.10 port 54648:11: disconnected by user
Oct 06 14:40:08 compute-0 sshd-session[234889]: Disconnected from user zuul 192.168.122.10 port 54648
Oct 06 14:40:08 compute-0 sshd-session[234886]: pam_unix(sshd:session): session closed for user zuul
Oct 06 14:40:08 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Oct 06 14:40:08 compute-0 systemd-logind[789]: Session 36 logged out. Waiting for processes to exit.
Oct 06 14:40:08 compute-0 systemd-logind[789]: Removed session 36.
Oct 06 14:40:08 compute-0 sshd-session[234915]: Accepted publickey for zuul from 192.168.122.10 port 54662 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 14:40:08 compute-0 systemd-logind[789]: New session 37 of user zuul.
Oct 06 14:40:08 compute-0 systemd[1]: Started Session 37 of User zuul.
Oct 06 14:40:08 compute-0 sshd-session[234915]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 14:40:08 compute-0 sudo[234919]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 06 14:40:08 compute-0 sudo[234919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 14:40:08 compute-0 sudo[234919]: pam_unix(sudo:session): session closed for user root
Oct 06 14:40:08 compute-0 sshd-session[234918]: Received disconnect from 192.168.122.10 port 54662:11: disconnected by user
Oct 06 14:40:08 compute-0 sshd-session[234918]: Disconnected from user zuul 192.168.122.10 port 54662
Oct 06 14:40:08 compute-0 sshd-session[234915]: pam_unix(sshd:session): session closed for user zuul
Oct 06 14:40:08 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Oct 06 14:40:08 compute-0 systemd-logind[789]: Session 37 logged out. Waiting for processes to exit.
Oct 06 14:40:08 compute-0 systemd-logind[789]: Removed session 37.
Oct 06 14:40:10 compute-0 nova_compute[192903]: 2025-10-06 14:40:10.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:40:11.428 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:40:11.429 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:40:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:40:11.429 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.096 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.624 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.625 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.625 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.626 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.819 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.821 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.865 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.866 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.29157638549805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.866 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:40:12 compute-0 nova_compute[192903]: 2025-10-06 14:40:12.866 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:40:13 compute-0 nova_compute[192903]: 2025-10-06 14:40:13.964 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:40:13 compute-0 nova_compute[192903]: 2025-10-06 14:40:13.964 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:40:12 up  1:41,  0 user,  load average: 1.07, 0.40, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:40:13 compute-0 nova_compute[192903]: 2025-10-06 14:40:13.997 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:40:14 compute-0 nova_compute[192903]: 2025-10-06 14:40:14.533 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:40:15 compute-0 nova_compute[192903]: 2025-10-06 14:40:15.144 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:40:15 compute-0 nova_compute[192903]: 2025-10-06 14:40:15.144 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.278s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:40:15 compute-0 nova_compute[192903]: 2025-10-06 14:40:15.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:15 compute-0 nova_compute[192903]: 2025-10-06 14:40:15.630 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:15 compute-0 nova_compute[192903]: 2025-10-06 14:40:15.630 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:17 compute-0 nova_compute[192903]: 2025-10-06 14:40:17.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:18 compute-0 podman[234949]: 2025-10-06 14:40:18.230115216 +0000 UTC m=+0.069337314 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:40:18 compute-0 podman[234947]: 2025-10-06 14:40:18.231630376 +0000 UTC m=+0.083553723 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 06 14:40:18 compute-0 podman[234948]: 2025-10-06 14:40:18.246765801 +0000 UTC m=+0.101694999 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 06 14:40:18 compute-0 podman[234946]: 2025-10-06 14:40:18.258906675 +0000 UTC m=+0.114634584 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 06 14:40:18 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 06 14:40:18 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 06 14:40:20 compute-0 nova_compute[192903]: 2025-10-06 14:40:20.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:22 compute-0 nova_compute[192903]: 2025-10-06 14:40:22.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:22 compute-0 nova_compute[192903]: 2025-10-06 14:40:22.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:22 compute-0 nova_compute[192903]: 2025-10-06 14:40:22.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:40:25 compute-0 nova_compute[192903]: 2025-10-06 14:40:25.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:25 compute-0 nova_compute[192903]: 2025-10-06 14:40:25.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:27 compute-0 nova_compute[192903]: 2025-10-06 14:40:27.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:27 compute-0 nova_compute[192903]: 2025-10-06 14:40:27.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:29 compute-0 podman[235031]: 2025-10-06 14:40:29.180115442 +0000 UTC m=+0.052740279 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:40:29 compute-0 nova_compute[192903]: 2025-10-06 14:40:29.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:29 compute-0 podman[203308]: time="2025-10-06T14:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:40:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:40:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 06 14:40:30 compute-0 nova_compute[192903]: 2025-10-06 14:40:30.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: ERROR   14:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: ERROR   14:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: ERROR   14:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: ERROR   14:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: ERROR   14:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:40:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:40:32 compute-0 nova_compute[192903]: 2025-10-06 14:40:32.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:33 compute-0 podman[235052]: 2025-10-06 14:40:33.228311261 +0000 UTC m=+0.091061974 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 06 14:40:33 compute-0 nova_compute[192903]: 2025-10-06 14:40:33.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:40:35 compute-0 nova_compute[192903]: 2025-10-06 14:40:35.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:37 compute-0 nova_compute[192903]: 2025-10-06 14:40:37.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:40 compute-0 nova_compute[192903]: 2025-10-06 14:40:40.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:42 compute-0 nova_compute[192903]: 2025-10-06 14:40:42.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:45 compute-0 nova_compute[192903]: 2025-10-06 14:40:45.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:47 compute-0 nova_compute[192903]: 2025-10-06 14:40:47.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:49 compute-0 podman[235075]: 2025-10-06 14:40:49.229418774 +0000 UTC m=+0.083521733 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 06 14:40:49 compute-0 podman[235074]: 2025-10-06 14:40:49.240942502 +0000 UTC m=+0.092631806 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:40:49 compute-0 podman[235076]: 2025-10-06 14:40:49.242297068 +0000 UTC m=+0.077216754 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:40:49 compute-0 podman[235073]: 2025-10-06 14:40:49.263916196 +0000 UTC m=+0.122882955 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 06 14:40:50 compute-0 nova_compute[192903]: 2025-10-06 14:40:50.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:52 compute-0 nova_compute[192903]: 2025-10-06 14:40:52.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:55 compute-0 nova_compute[192903]: 2025-10-06 14:40:55.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:57 compute-0 nova_compute[192903]: 2025-10-06 14:40:57.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:40:59 compute-0 podman[203308]: time="2025-10-06T14:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:40:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:40:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 06 14:41:00 compute-0 podman[235157]: 2025-10-06 14:41:00.193668721 +0000 UTC m=+0.056169982 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid)
Oct 06 14:41:00 compute-0 nova_compute[192903]: 2025-10-06 14:41:00.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: ERROR   14:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: ERROR   14:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: ERROR   14:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: ERROR   14:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: ERROR   14:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:41:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:41:02 compute-0 nova_compute[192903]: 2025-10-06 14:41:02.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:04 compute-0 podman[235177]: 2025-10-06 14:41:04.209597098 +0000 UTC m=+0.067832973 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Oct 06 14:41:05 compute-0 nova_compute[192903]: 2025-10-06 14:41:05.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:07 compute-0 nova_compute[192903]: 2025-10-06 14:41:07.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:10 compute-0 nova_compute[192903]: 2025-10-06 14:41:10.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:41:11.431 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:41:11.432 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:41:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:41:11.432 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:41:12 compute-0 nova_compute[192903]: 2025-10-06 14:41:12.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:12 compute-0 nova_compute[192903]: 2025-10-06 14:41:12.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:13 compute-0 nova_compute[192903]: 2025-10-06 14:41:13.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.098 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.099 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.237 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.238 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.254 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.255 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5809MB free_disk=73.29301071166992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.256 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:41:14 compute-0 nova_compute[192903]: 2025-10-06 14:41:14.256 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:41:15 compute-0 nova_compute[192903]: 2025-10-06 14:41:15.412 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:41:15 compute-0 nova_compute[192903]: 2025-10-06 14:41:15.413 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:41:14 up  1:42,  0 user,  load average: 0.39, 0.33, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:41:15 compute-0 nova_compute[192903]: 2025-10-06 14:41:15.561 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:41:15 compute-0 nova_compute[192903]: 2025-10-06 14:41:15.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:16 compute-0 nova_compute[192903]: 2025-10-06 14:41:16.069 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:41:16 compute-0 nova_compute[192903]: 2025-10-06 14:41:16.584 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:41:16 compute-0 nova_compute[192903]: 2025-10-06 14:41:16.584 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.328s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:41:17 compute-0 nova_compute[192903]: 2025-10-06 14:41:17.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:18 compute-0 nova_compute[192903]: 2025-10-06 14:41:18.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:20 compute-0 podman[235208]: 2025-10-06 14:41:20.193262532 +0000 UTC m=+0.053025237 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:41:20 compute-0 podman[235201]: 2025-10-06 14:41:20.216891744 +0000 UTC m=+0.073128355 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:41:20 compute-0 podman[235202]: 2025-10-06 14:41:20.218729733 +0000 UTC m=+0.069205040 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:41:20 compute-0 podman[235200]: 2025-10-06 14:41:20.221295401 +0000 UTC m=+0.091592428 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 06 14:41:20 compute-0 nova_compute[192903]: 2025-10-06 14:41:20.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:22 compute-0 nova_compute[192903]: 2025-10-06 14:41:22.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:24 compute-0 nova_compute[192903]: 2025-10-06 14:41:24.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:24 compute-0 nova_compute[192903]: 2025-10-06 14:41:24.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:41:25 compute-0 nova_compute[192903]: 2025-10-06 14:41:25.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:25 compute-0 nova_compute[192903]: 2025-10-06 14:41:25.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:27 compute-0 nova_compute[192903]: 2025-10-06 14:41:27.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:27 compute-0 nova_compute[192903]: 2025-10-06 14:41:27.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:28 compute-0 nova_compute[192903]: 2025-10-06 14:41:28.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:29 compute-0 podman[203308]: time="2025-10-06T14:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:41:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:41:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:41:30 compute-0 nova_compute[192903]: 2025-10-06 14:41:30.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:31 compute-0 podman[235286]: 2025-10-06 14:41:31.216097824 +0000 UTC m=+0.073477585 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: ERROR   14:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: ERROR   14:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: ERROR   14:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: ERROR   14:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: ERROR   14:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:41:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:41:31 compute-0 nova_compute[192903]: 2025-10-06 14:41:31.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:32 compute-0 nova_compute[192903]: 2025-10-06 14:41:32.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:33 compute-0 nova_compute[192903]: 2025-10-06 14:41:33.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:41:35 compute-0 podman[235306]: 2025-10-06 14:41:35.235277697 +0000 UTC m=+0.089132931 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct 06 14:41:35 compute-0 nova_compute[192903]: 2025-10-06 14:41:35.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:37 compute-0 nova_compute[192903]: 2025-10-06 14:41:37.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:40 compute-0 nova_compute[192903]: 2025-10-06 14:41:40.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:42 compute-0 nova_compute[192903]: 2025-10-06 14:41:42.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:45 compute-0 nova_compute[192903]: 2025-10-06 14:41:45.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:47 compute-0 nova_compute[192903]: 2025-10-06 14:41:47.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:50 compute-0 nova_compute[192903]: 2025-10-06 14:41:50.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:51 compute-0 podman[235330]: 2025-10-06 14:41:51.229564578 +0000 UTC m=+0.074195794 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:41:51 compute-0 podman[235327]: 2025-10-06 14:41:51.259538559 +0000 UTC m=+0.112343973 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:41:51 compute-0 podman[235329]: 2025-10-06 14:41:51.260659819 +0000 UTC m=+0.103049075 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 06 14:41:51 compute-0 podman[235328]: 2025-10-06 14:41:51.26483751 +0000 UTC m=+0.112549728 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:41:52 compute-0 nova_compute[192903]: 2025-10-06 14:41:52.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:55 compute-0 nova_compute[192903]: 2025-10-06 14:41:55.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:57 compute-0 nova_compute[192903]: 2025-10-06 14:41:57.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:41:59 compute-0 podman[203308]: time="2025-10-06T14:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:41:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:41:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:42:00 compute-0 nova_compute[192903]: 2025-10-06 14:42:00.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: ERROR   14:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: ERROR   14:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: ERROR   14:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: ERROR   14:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: ERROR   14:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:42:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:42:02 compute-0 nova_compute[192903]: 2025-10-06 14:42:02.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:02 compute-0 podman[235410]: 2025-10-06 14:42:02.197523392 +0000 UTC m=+0.064084913 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 14:42:05 compute-0 nova_compute[192903]: 2025-10-06 14:42:05.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:06 compute-0 podman[235431]: 2025-10-06 14:42:06.236479073 +0000 UTC m=+0.094875836 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=)
Oct 06 14:42:07 compute-0 nova_compute[192903]: 2025-10-06 14:42:07.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:10 compute-0 nova_compute[192903]: 2025-10-06 14:42:10.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:42:11.433 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:42:11.433 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:42:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:42:11.433 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:42:12 compute-0 nova_compute[192903]: 2025-10-06 14:42:12.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:12 compute-0 nova_compute[192903]: 2025-10-06 14:42:12.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:15 compute-0 nova_compute[192903]: 2025-10-06 14:42:15.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:15 compute-0 nova_compute[192903]: 2025-10-06 14:42:15.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.100 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.318 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.319 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.343 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.344 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.29301071166992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:42:16 compute-0 nova_compute[192903]: 2025-10-06 14:42:16.345 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:42:17 compute-0 nova_compute[192903]: 2025-10-06 14:42:17.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:17 compute-0 nova_compute[192903]: 2025-10-06 14:42:17.505 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:42:17 compute-0 nova_compute[192903]: 2025-10-06 14:42:17.506 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:42:16 up  1:43,  0 user,  load average: 0.13, 0.26, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:42:17 compute-0 nova_compute[192903]: 2025-10-06 14:42:17.534 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:42:18 compute-0 nova_compute[192903]: 2025-10-06 14:42:18.041 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:42:18 compute-0 nova_compute[192903]: 2025-10-06 14:42:18.549 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:42:18 compute-0 nova_compute[192903]: 2025-10-06 14:42:18.549 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.204s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:42:20 compute-0 nova_compute[192903]: 2025-10-06 14:42:20.546 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:20 compute-0 nova_compute[192903]: 2025-10-06 14:42:20.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:22 compute-0 nova_compute[192903]: 2025-10-06 14:42:22.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:22 compute-0 podman[235455]: 2025-10-06 14:42:22.230539189 +0000 UTC m=+0.080939674 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:42:22 compute-0 podman[235454]: 2025-10-06 14:42:22.271282348 +0000 UTC m=+0.121094537 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Oct 06 14:42:22 compute-0 podman[235456]: 2025-10-06 14:42:22.274850233 +0000 UTC m=+0.113911625 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:42:22 compute-0 podman[235457]: 2025-10-06 14:42:22.287191583 +0000 UTC m=+0.118310183 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:42:24 compute-0 nova_compute[192903]: 2025-10-06 14:42:24.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:24 compute-0 nova_compute[192903]: 2025-10-06 14:42:24.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:42:25 compute-0 nova_compute[192903]: 2025-10-06 14:42:25.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:27 compute-0 nova_compute[192903]: 2025-10-06 14:42:27.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:27 compute-0 nova_compute[192903]: 2025-10-06 14:42:27.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:29 compute-0 podman[203308]: time="2025-10-06T14:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:42:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:42:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 06 14:42:30 compute-0 nova_compute[192903]: 2025-10-06 14:42:30.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:30 compute-0 nova_compute[192903]: 2025-10-06 14:42:30.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: ERROR   14:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: ERROR   14:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: ERROR   14:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: ERROR   14:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: ERROR   14:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:42:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:42:32 compute-0 nova_compute[192903]: 2025-10-06 14:42:32.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:32 compute-0 nova_compute[192903]: 2025-10-06 14:42:32.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:33 compute-0 podman[235538]: 2025-10-06 14:42:33.238903094 +0000 UTC m=+0.097016343 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 06 14:42:35 compute-0 nova_compute[192903]: 2025-10-06 14:42:35.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:42:35 compute-0 nova_compute[192903]: 2025-10-06 14:42:35.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:37 compute-0 nova_compute[192903]: 2025-10-06 14:42:37.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:37 compute-0 podman[235558]: 2025-10-06 14:42:37.209637894 +0000 UTC m=+0.075524409 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc.)
Oct 06 14:42:40 compute-0 nova_compute[192903]: 2025-10-06 14:42:40.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:42 compute-0 nova_compute[192903]: 2025-10-06 14:42:42.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:45 compute-0 nova_compute[192903]: 2025-10-06 14:42:45.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:47 compute-0 nova_compute[192903]: 2025-10-06 14:42:47.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:50 compute-0 nova_compute[192903]: 2025-10-06 14:42:50.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:52 compute-0 nova_compute[192903]: 2025-10-06 14:42:52.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:53 compute-0 podman[235581]: 2025-10-06 14:42:53.197828472 +0000 UTC m=+0.055261338 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, 
org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:42:53 compute-0 podman[235580]: 2025-10-06 14:42:53.199668731 +0000 UTC m=+0.062875111 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 06 14:42:53 compute-0 podman[235587]: 2025-10-06 14:42:53.229749995 +0000 UTC m=+0.083613026 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 06 14:42:53 compute-0 podman[235579]: 2025-10-06 14:42:53.267786691 +0000 UTC m=+0.135737348 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 06 14:42:55 compute-0 nova_compute[192903]: 2025-10-06 14:42:55.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:57 compute-0 nova_compute[192903]: 2025-10-06 14:42:57.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:42:59 compute-0 podman[203308]: time="2025-10-06T14:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:42:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:42:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 06 14:43:00 compute-0 nova_compute[192903]: 2025-10-06 14:43:00.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: ERROR   14:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: ERROR   14:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: ERROR   14:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: ERROR   14:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: ERROR   14:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:43:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:43:02 compute-0 nova_compute[192903]: 2025-10-06 14:43:02.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:04 compute-0 podman[235664]: 2025-10-06 14:43:04.234723999 +0000 UTC m=+0.101718839 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930)
Oct 06 14:43:05 compute-0 nova_compute[192903]: 2025-10-06 14:43:05.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:07 compute-0 nova_compute[192903]: 2025-10-06 14:43:07.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:08 compute-0 podman[235685]: 2025-10-06 14:43:08.238162502 +0000 UTC m=+0.094897907 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct 06 14:43:10 compute-0 nova_compute[192903]: 2025-10-06 14:43:10.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:11.434 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:11.434 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:43:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:11.434 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:43:12 compute-0 nova_compute[192903]: 2025-10-06 14:43:12.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:13 compute-0 nova_compute[192903]: 2025-10-06 14:43:13.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:15 compute-0 nova_compute[192903]: 2025-10-06 14:43:15.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:17 compute-0 nova_compute[192903]: 2025-10-06 14:43:17.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:17 compute-0 nova_compute[192903]: 2025-10-06 14:43:17.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.099 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.101 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.311 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.314 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.340 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.341 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.29302978515625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.342 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:43:18 compute-0 nova_compute[192903]: 2025-10-06 14:43:18.343 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.407 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.407 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:43:18 up  1:44,  0 user,  load average: 0.05, 0.21, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.435 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing inventories for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.463 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating ProviderTree inventory for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.464 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Updating inventory in ProviderTree for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.485 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing aggregate associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.542 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Refreshing trait associations for resource provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_F16C,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOUND_MODEL_USB,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX2,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_ARCH_X86_64,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_EXTEND,COMPUTE_ACCELERATORS,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_STATELESS_FIRMWARE,HW_CPU_X86_MMX,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SECURITY_TPM_CRB,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,CO
MPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_CLMUL,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 06 14:43:19 compute-0 nova_compute[192903]: 2025-10-06 14:43:19.571 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:43:20 compute-0 nova_compute[192903]: 2025-10-06 14:43:20.080 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:43:20 compute-0 nova_compute[192903]: 2025-10-06 14:43:20.592 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:43:20 compute-0 nova_compute[192903]: 2025-10-06 14:43:20.593 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.250s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:43:20 compute-0 nova_compute[192903]: 2025-10-06 14:43:20.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:21 compute-0 nova_compute[192903]: 2025-10-06 14:43:21.590 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:22 compute-0 nova_compute[192903]: 2025-10-06 14:43:22.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:24 compute-0 podman[235713]: 2025-10-06 14:43:24.226704748 +0000 UTC m=+0.064654178 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:43:24 compute-0 podman[235712]: 2025-10-06 14:43:24.255048966 +0000 UTC m=+0.093369426 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 06 14:43:24 compute-0 podman[235711]: 2025-10-06 14:43:24.256035932 +0000 UTC m=+0.101388290 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 06 14:43:24 compute-0 podman[235710]: 2025-10-06 14:43:24.293851712 +0000 UTC m=+0.145145019 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 06 14:43:25 compute-0 nova_compute[192903]: 2025-10-06 14:43:25.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:25 compute-0 nova_compute[192903]: 2025-10-06 14:43:25.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:43:25 compute-0 nova_compute[192903]: 2025-10-06 14:43:25.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:27 compute-0 nova_compute[192903]: 2025-10-06 14:43:27.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:27 compute-0 nova_compute[192903]: 2025-10-06 14:43:27.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:29 compute-0 podman[203308]: time="2025-10-06T14:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:43:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:43:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 06 14:43:30 compute-0 nova_compute[192903]: 2025-10-06 14:43:30.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:30 compute-0 nova_compute[192903]: 2025-10-06 14:43:30.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: ERROR   14:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: ERROR   14:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: ERROR   14:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: ERROR   14:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: ERROR   14:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:43:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:43:32 compute-0 nova_compute[192903]: 2025-10-06 14:43:32.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:32 compute-0 nova_compute[192903]: 2025-10-06 14:43:32.584 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:32 compute-0 nova_compute[192903]: 2025-10-06 14:43:32.585 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:35 compute-0 podman[235792]: 2025-10-06 14:43:35.229888285 +0000 UTC m=+0.087583171 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 06 14:43:35 compute-0 nova_compute[192903]: 2025-10-06 14:43:35.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:36 compute-0 nova_compute[192903]: 2025-10-06 14:43:36.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:43:37 compute-0 nova_compute[192903]: 2025-10-06 14:43:37.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:39 compute-0 podman[235812]: 2025-10-06 14:43:39.19792834 +0000 UTC m=+0.063369364 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 06 14:43:40 compute-0 nova_compute[192903]: 2025-10-06 14:43:40.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:42 compute-0 nova_compute[192903]: 2025-10-06 14:43:42.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:45 compute-0 nova_compute[192903]: 2025-10-06 14:43:45.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:47 compute-0 nova_compute[192903]: 2025-10-06 14:43:47.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:50 compute-0 nova_compute[192903]: 2025-10-06 14:43:50.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:52.066 104072 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'b2:c7:40', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'e6:b1:2e:5e:6c:e8'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 06 14:43:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:52.067 104072 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 06 14:43:52 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:43:52.067 104072 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=6cb79b8b-7bef-432f-9e10-9690a1ce5aa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 06 14:43:52 compute-0 nova_compute[192903]: 2025-10-06 14:43:52.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:52 compute-0 nova_compute[192903]: 2025-10-06 14:43:52.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:55 compute-0 podman[235837]: 2025-10-06 14:43:55.225926191 +0000 UTC m=+0.072297912 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:43:55 compute-0 podman[235836]: 2025-10-06 14:43:55.245465084 +0000 UTC m=+0.087098169 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:43:55 compute-0 podman[235834]: 2025-10-06 14:43:55.249533212 +0000 UTC m=+0.102391687 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:43:55 compute-0 podman[235835]: 2025-10-06 14:43:55.258782929 +0000 UTC m=+0.103488076 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Oct 06 14:43:55 compute-0 nova_compute[192903]: 2025-10-06 14:43:55.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:57 compute-0 nova_compute[192903]: 2025-10-06 14:43:57.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:43:59 compute-0 podman[203308]: time="2025-10-06T14:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:43:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:43:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 06 14:44:00 compute-0 nova_compute[192903]: 2025-10-06 14:44:00.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:01 compute-0 openstack_network_exporter[205500]: ERROR   14:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:44:01 compute-0 openstack_network_exporter[205500]: ERROR   14:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:44:01 compute-0 openstack_network_exporter[205500]: ERROR   14:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:44:01 compute-0 openstack_network_exporter[205500]: ERROR   14:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:44:01 compute-0 openstack_network_exporter[205500]: ERROR   14:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:44:02 compute-0 nova_compute[192903]: 2025-10-06 14:44:02.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:05 compute-0 nova_compute[192903]: 2025-10-06 14:44:05.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:06 compute-0 podman[235921]: 2025-10-06 14:44:06.00497972 +0000 UTC m=+0.082269139 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 06 14:44:07 compute-0 nova_compute[192903]: 2025-10-06 14:44:07.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:10 compute-0 podman[235942]: 2025-10-06 14:44:10.214944491 +0000 UTC m=+0.079351301 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, architecture=x86_64, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:44:10 compute-0 nova_compute[192903]: 2025-10-06 14:44:10.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:44:11.435 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:44:11.436 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:44:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:44:11.436 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:44:12 compute-0 nova_compute[192903]: 2025-10-06 14:44:12.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:15 compute-0 nova_compute[192903]: 2025-10-06 14:44:15.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:15 compute-0 nova_compute[192903]: 2025-10-06 14:44:15.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:17 compute-0 nova_compute[192903]: 2025-10-06 14:44:17.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:19 compute-0 nova_compute[192903]: 2025-10-06 14:44:19.578 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:19 compute-0 nova_compute[192903]: 2025-10-06 14:44:19.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.101 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.258 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.259 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.292 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.293 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.29310607910156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.293 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.293 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:44:20 compute-0 nova_compute[192903]: 2025-10-06 14:44:20.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:21 compute-0 nova_compute[192903]: 2025-10-06 14:44:21.372 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:44:21 compute-0 nova_compute[192903]: 2025-10-06 14:44:21.372 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:44:20 up  1:45,  0 user,  load average: 0.04, 0.18, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:44:21 compute-0 nova_compute[192903]: 2025-10-06 14:44:21.392 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:44:21 compute-0 nova_compute[192903]: 2025-10-06 14:44:21.899 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:44:22 compute-0 nova_compute[192903]: 2025-10-06 14:44:22.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:22 compute-0 nova_compute[192903]: 2025-10-06 14:44:22.410 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:44:22 compute-0 nova_compute[192903]: 2025-10-06 14:44:22.410 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:44:25 compute-0 nova_compute[192903]: 2025-10-06 14:44:25.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:26 compute-0 podman[235967]: 2025-10-06 14:44:26.216237159 +0000 UTC m=+0.054943519 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct 06 14:44:26 compute-0 podman[235966]: 2025-10-06 14:44:26.248713737 +0000 UTC m=+0.081801847 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 06 14:44:26 compute-0 podman[235968]: 2025-10-06 14:44:26.248771288 +0000 UTC m=+0.070604347 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 06 14:44:26 compute-0 podman[235965]: 2025-10-06 14:44:26.275950674 +0000 UTC m=+0.115770304 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 06 14:44:27 compute-0 nova_compute[192903]: 2025-10-06 14:44:27.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:27 compute-0 nova_compute[192903]: 2025-10-06 14:44:27.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:27 compute-0 nova_compute[192903]: 2025-10-06 14:44:27.581 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:44:27 compute-0 nova_compute[192903]: 2025-10-06 14:44:27.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:27 compute-0 nova_compute[192903]: 2025-10-06 14:44:27.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 06 14:44:28 compute-0 nova_compute[192903]: 2025-10-06 14:44:28.088 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 06 14:44:29 compute-0 podman[203308]: time="2025-10-06T14:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:44:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:44:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Oct 06 14:44:30 compute-0 nova_compute[192903]: 2025-10-06 14:44:30.089 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:30 compute-0 nova_compute[192903]: 2025-10-06 14:44:30.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:31 compute-0 openstack_network_exporter[205500]: ERROR   14:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:44:31 compute-0 openstack_network_exporter[205500]: ERROR   14:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:44:31 compute-0 openstack_network_exporter[205500]: ERROR   14:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:44:31 compute-0 openstack_network_exporter[205500]: ERROR   14:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:44:31 compute-0 openstack_network_exporter[205500]: ERROR   14:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:44:32 compute-0 nova_compute[192903]: 2025-10-06 14:44:32.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:33 compute-0 nova_compute[192903]: 2025-10-06 14:44:33.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:33 compute-0 nova_compute[192903]: 2025-10-06 14:44:33.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:35 compute-0 nova_compute[192903]: 2025-10-06 14:44:35.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:36 compute-0 podman[236047]: 2025-10-06 14:44:36.248317278 +0000 UTC m=+0.101191804 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct 06 14:44:37 compute-0 nova_compute[192903]: 2025-10-06 14:44:37.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:37 compute-0 nova_compute[192903]: 2025-10-06 14:44:37.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:40 compute-0 nova_compute[192903]: 2025-10-06 14:44:40.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:41 compute-0 podman[236068]: 2025-10-06 14:44:41.205928837 +0000 UTC m=+0.069038956 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 06 14:44:42 compute-0 nova_compute[192903]: 2025-10-06 14:44:42.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:45 compute-0 nova_compute[192903]: 2025-10-06 14:44:45.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:47 compute-0 nova_compute[192903]: 2025-10-06 14:44:47.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:47 compute-0 nova_compute[192903]: 2025-10-06 14:44:47.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:44:50 compute-0 nova_compute[192903]: 2025-10-06 14:44:50.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:52 compute-0 nova_compute[192903]: 2025-10-06 14:44:52.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:55 compute-0 nova_compute[192903]: 2025-10-06 14:44:55.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:57 compute-0 nova_compute[192903]: 2025-10-06 14:44:57.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:44:57 compute-0 podman[236092]: 2025-10-06 14:44:57.233853699 +0000 UTC m=+0.081567831 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 06 14:44:57 compute-0 podman[236090]: 2025-10-06 14:44:57.234022603 +0000 UTC m=+0.088692640 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 06 14:44:57 compute-0 podman[236091]: 2025-10-06 14:44:57.241210095 +0000 UTC m=+0.087778256 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 06 14:44:57 compute-0 podman[236089]: 2025-10-06 14:44:57.287488252 +0000 UTC m=+0.144314147 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 06 14:44:59 compute-0 podman[203308]: time="2025-10-06T14:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:44:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:44:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 06 14:45:00 compute-0 nova_compute[192903]: 2025-10-06 14:45:00.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: ERROR   14:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: ERROR   14:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: ERROR   14:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: ERROR   14:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: ERROR   14:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:45:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:45:02 compute-0 nova_compute[192903]: 2025-10-06 14:45:02.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:05 compute-0 nova_compute[192903]: 2025-10-06 14:45:05.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:07 compute-0 nova_compute[192903]: 2025-10-06 14:45:07.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:07 compute-0 podman[236175]: 2025-10-06 14:45:07.262889336 +0000 UTC m=+0.121153798 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 06 14:45:08 compute-0 nova_compute[192903]: 2025-10-06 14:45:08.087 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:08 compute-0 nova_compute[192903]: 2025-10-06 14:45:08.088 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 06 14:45:10 compute-0 nova_compute[192903]: 2025-10-06 14:45:10.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:45:11.437 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:45:11.437 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:45:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:45:11.437 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:45:12 compute-0 nova_compute[192903]: 2025-10-06 14:45:12.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:12 compute-0 podman[236196]: 2025-10-06 14:45:12.240308623 +0000 UTC m=+0.088135086 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git)
Oct 06 14:45:15 compute-0 nova_compute[192903]: 2025-10-06 14:45:15.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:17 compute-0 nova_compute[192903]: 2025-10-06 14:45:17.092 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:17 compute-0 nova_compute[192903]: 2025-10-06 14:45:17.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:20 compute-0 nova_compute[192903]: 2025-10-06 14:45:20.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:20 compute-0 nova_compute[192903]: 2025-10-06 14:45:20.580 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:20 compute-0 nova_compute[192903]: 2025-10-06 14:45:20.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.100 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.101 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.102 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.302 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.304 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.322 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.323 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=73.29308700561523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.323 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:45:21 compute-0 nova_compute[192903]: 2025-10-06 14:45:21.324 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:45:22 compute-0 nova_compute[192903]: 2025-10-06 14:45:22.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:22 compute-0 nova_compute[192903]: 2025-10-06 14:45:22.380 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:45:22 compute-0 nova_compute[192903]: 2025-10-06 14:45:22.381 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:45:21 up  1:46,  0 user,  load average: 0.07, 0.16, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:45:22 compute-0 nova_compute[192903]: 2025-10-06 14:45:22.401 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:45:22 compute-0 nova_compute[192903]: 2025-10-06 14:45:22.910 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:45:23 compute-0 nova_compute[192903]: 2025-10-06 14:45:23.423 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:45:23 compute-0 nova_compute[192903]: 2025-10-06 14:45:23.424 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:45:25 compute-0 nova_compute[192903]: 2025-10-06 14:45:25.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:27 compute-0 nova_compute[192903]: 2025-10-06 14:45:27.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:28 compute-0 podman[236224]: 2025-10-06 14:45:28.193834713 +0000 UTC m=+0.049827392 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 06 14:45:28 compute-0 podman[236219]: 2025-10-06 14:45:28.204637382 +0000 UTC m=+0.066641002 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 06 14:45:28 compute-0 podman[236226]: 2025-10-06 14:45:28.207765665 +0000 UTC m=+0.059620094 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 06 14:45:28 compute-0 podman[236218]: 2025-10-06 14:45:28.231328625 +0000 UTC m=+0.100083145 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Oct 06 14:45:28 compute-0 nova_compute[192903]: 2025-10-06 14:45:28.425 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:28 compute-0 nova_compute[192903]: 2025-10-06 14:45:28.425 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:45:29 compute-0 nova_compute[192903]: 2025-10-06 14:45:29.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:29 compute-0 podman[203308]: time="2025-10-06T14:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:45:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:45:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 06 14:45:30 compute-0 nova_compute[192903]: 2025-10-06 14:45:30.577 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:30 compute-0 nova_compute[192903]: 2025-10-06 14:45:30.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: ERROR   14:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: ERROR   14:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: ERROR   14:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: ERROR   14:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: ERROR   14:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:45:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:45:32 compute-0 nova_compute[192903]: 2025-10-06 14:45:32.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:34 compute-0 nova_compute[192903]: 2025-10-06 14:45:34.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:35 compute-0 nova_compute[192903]: 2025-10-06 14:45:35.583 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:35 compute-0 nova_compute[192903]: 2025-10-06 14:45:35.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:37 compute-0 nova_compute[192903]: 2025-10-06 14:45:37.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:38 compute-0 podman[236302]: 2025-10-06 14:45:38.227942366 +0000 UTC m=+0.085688060 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4)
Oct 06 14:45:38 compute-0 nova_compute[192903]: 2025-10-06 14:45:38.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:45:40 compute-0 nova_compute[192903]: 2025-10-06 14:45:40.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:42 compute-0 nova_compute[192903]: 2025-10-06 14:45:42.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:43 compute-0 podman[236324]: 2025-10-06 14:45:43.236438325 +0000 UTC m=+0.093980593 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 06 14:45:45 compute-0 nova_compute[192903]: 2025-10-06 14:45:45.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:47 compute-0 nova_compute[192903]: 2025-10-06 14:45:47.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:50 compute-0 nova_compute[192903]: 2025-10-06 14:45:50.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:52 compute-0 nova_compute[192903]: 2025-10-06 14:45:52.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:55 compute-0 nova_compute[192903]: 2025-10-06 14:45:55.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:57 compute-0 nova_compute[192903]: 2025-10-06 14:45:57.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:45:59 compute-0 podman[236348]: 2025-10-06 14:45:59.222178356 +0000 UTC m=+0.069156008 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 06 14:45:59 compute-0 podman[236347]: 2025-10-06 14:45:59.232877732 +0000 UTC m=+0.080097751 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 06 14:45:59 compute-0 podman[236354]: 2025-10-06 14:45:59.256949585 +0000 UTC m=+0.094845855 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:45:59 compute-0 podman[236346]: 2025-10-06 14:45:59.277088474 +0000 UTC m=+0.138938614 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Oct 06 14:45:59 compute-0 podman[203308]: time="2025-10-06T14:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:45:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:45:59 compute-0 podman[203308]: @ - - [06/Oct/2025:14:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 06 14:46:00 compute-0 nova_compute[192903]: 2025-10-06 14:46:00.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: ERROR   14:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: ERROR   14:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: ERROR   14:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: ERROR   14:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: ERROR   14:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:46:01 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:46:02 compute-0 nova_compute[192903]: 2025-10-06 14:46:02.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:05 compute-0 nova_compute[192903]: 2025-10-06 14:46:05.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:07 compute-0 nova_compute[192903]: 2025-10-06 14:46:07.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:09 compute-0 podman[236429]: 2025-10-06 14:46:09.218672086 +0000 UTC m=+0.077945814 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 06 14:46:10 compute-0 nova_compute[192903]: 2025-10-06 14:46:10.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:46:11.438 104072 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:46:11.439 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:46:11 compute-0 ovn_metadata_agent[104057]: 2025-10-06 14:46:11.439 104072 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:46:12 compute-0 nova_compute[192903]: 2025-10-06 14:46:12.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:14 compute-0 podman[236451]: 2025-10-06 14:46:14.208222437 +0000 UTC m=+0.070523686 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container)
Oct 06 14:46:15 compute-0 nova_compute[192903]: 2025-10-06 14:46:15.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:16 compute-0 nova_compute[192903]: 2025-10-06 14:46:16.587 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:17 compute-0 nova_compute[192903]: 2025-10-06 14:46:17.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:20 compute-0 nova_compute[192903]: 2025-10-06 14:46:20.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:20 compute-0 nova_compute[192903]: 2025-10-06 14:46:20.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.107 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.108 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.108 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.108 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.300 2 WARNING nova.virt.libvirt.driver [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.302 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.331 2 DEBUG oslo_concurrency.processutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.332 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.29310607910156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.333 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 06 14:46:21 compute-0 nova_compute[192903]: 2025-10-06 14:46:21.333 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 06 14:46:22 compute-0 nova_compute[192903]: 2025-10-06 14:46:22.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:22 compute-0 nova_compute[192903]: 2025-10-06 14:46:22.445 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 06 14:46:22 compute-0 nova_compute[192903]: 2025-10-06 14:46:22.446 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:46:21 up  1:47,  0 user,  load average: 0.02, 0.13, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 06 14:46:22 compute-0 nova_compute[192903]: 2025-10-06 14:46:22.539 2 DEBUG nova.compute.provider_tree [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed in ProviderTree for provider: 603c9dc2-ee32-4e36-82be-dcfb995e2be1 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 06 14:46:23 compute-0 nova_compute[192903]: 2025-10-06 14:46:23.048 2 DEBUG nova.scheduler.client.report [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Inventory has not changed for provider 603c9dc2-ee32-4e36-82be-dcfb995e2be1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 06 14:46:23 compute-0 nova_compute[192903]: 2025-10-06 14:46:23.561 2 DEBUG nova.compute.resource_tracker [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 06 14:46:23 compute-0 nova_compute[192903]: 2025-10-06 14:46:23.562 2 DEBUG oslo_concurrency.lockutils [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.229s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 06 14:46:25 compute-0 nova_compute[192903]: 2025-10-06 14:46:25.559 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:25 compute-0 nova_compute[192903]: 2025-10-06 14:46:25.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:25 compute-0 nova_compute[192903]: 2025-10-06 14:46:25.582 2 DEBUG nova.compute.manager [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 06 14:46:25 compute-0 nova_compute[192903]: 2025-10-06 14:46:25.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:27 compute-0 nova_compute[192903]: 2025-10-06 14:46:27.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:29 compute-0 nova_compute[192903]: 2025-10-06 14:46:29.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:29 compute-0 podman[203308]: time="2025-10-06T14:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 06 14:46:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19540 "" "Go-http-client/1.1"
Oct 06 14:46:29 compute-0 podman[203308]: @ - - [06/Oct/2025:14:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 06 14:46:30 compute-0 podman[236477]: 2025-10-06 14:46:30.237231895 +0000 UTC m=+0.077768789 container health_status d8b706f5be6e0f566cc44a23dd5a24173383b10f4f72432c2e682fd815bd41c9 (image=38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 06 14:46:30 compute-0 podman[236476]: 2025-10-06 14:46:30.24863983 +0000 UTC m=+0.092996306 container health_status afefbf3f36f19699219265edd4303c27423f3424dab902dafaa3a25af01ab6d9 (image=38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 06 14:46:30 compute-0 podman[236478]: 2025-10-06 14:46:30.253007767 +0000 UTC m=+0.084746736 container health_status fccbf7604b643a8d70f11374d308b8ea3123da090a806af43728184fd5ae9e12 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 06 14:46:30 compute-0 podman[236475]: 2025-10-06 14:46:30.268421449 +0000 UTC m=+0.118655312 container health_status 20a8f3a24333808f4e8b14f0014273d89e004e61d7cc09d7b128d7d6c5166c5b (image=38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 06 14:46:30 compute-0 nova_compute[192903]: 2025-10-06 14:46:30.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: ERROR   14:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: ERROR   14:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: ERROR   14:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: ERROR   14:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: ERROR   14:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 06 14:46:31 compute-0 openstack_network_exporter[205500]: 
Oct 06 14:46:32 compute-0 nova_compute[192903]: 2025-10-06 14:46:32.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:35 compute-0 nova_compute[192903]: 2025-10-06 14:46:35.582 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:35 compute-0 nova_compute[192903]: 2025-10-06 14:46:35.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:35 compute-0 sshd-session[236555]: Accepted publickey for zuul from 192.168.122.10 port 44990 ssh2: ECDSA SHA256:8vdVRH/nJXo48rVn9qMRflP4HlOHXuJisqRafCeYq8Y
Oct 06 14:46:35 compute-0 systemd-logind[789]: New session 38 of user zuul.
Oct 06 14:46:35 compute-0 systemd[1]: Started Session 38 of User zuul.
Oct 06 14:46:35 compute-0 sshd-session[236555]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 06 14:46:36 compute-0 sudo[236559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 06 14:46:36 compute-0 sudo[236559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 06 14:46:36 compute-0 nova_compute[192903]: 2025-10-06 14:46:36.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:37 compute-0 nova_compute[192903]: 2025-10-06 14:46:37.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:38 compute-0 nova_compute[192903]: 2025-10-06 14:46:38.581 2 DEBUG oslo_service.periodic_task [None req-a3efc3b2-132a-45c6-abd4-ebfed549fd7c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 06 14:46:40 compute-0 podman[236705]: 2025-10-06 14:46:40.210095972 +0000 UTC m=+0.076344941 container health_status b3a722e2d2fc8a946cb8cce08f7747c833ecc7e968f9eff07d50bc6dc46e3480 (image=38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.151:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 06 14:46:40 compute-0 ovs-vsctl[236751]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 06 14:46:40 compute-0 nova_compute[192903]: 2025-10-06 14:46:40.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:41 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 06 14:46:41 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 06 14:46:41 compute-0 virtqemud[192802]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 06 14:46:42 compute-0 nova_compute[192903]: 2025-10-06 14:46:42.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:42 compute-0 crontab[237166]: (root) LIST (root)
Oct 06 14:46:44 compute-0 systemd[1]: Starting Hostname Service...
Oct 06 14:46:45 compute-0 podman[237291]: 2025-10-06 14:46:45.002294551 +0000 UTC m=+0.059199153 container health_status 105e21466dfc6c0269067045656dd023b8857b9e2f94ea0f9db0872b534b8e79 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, vcs-type=git)
Oct 06 14:46:45 compute-0 systemd[1]: Started Hostname Service.
Oct 06 14:46:45 compute-0 nova_compute[192903]: 2025-10-06 14:46:45.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 06 14:46:47 compute-0 nova_compute[192903]: 2025-10-06 14:46:47.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
